PWD 2026 Week 2 - AI or Human?

"Most US jobs, the threat to your job is not AI, it’s some other person who uses AI better than you do."
~ Tyler Cowen
This week's PWD will discuss an interview of Tyler Cowen by Molly Wood. Cowen is the Holbert L. Harris Chair of Economics at George Mason University, an author, and a deeply influential public thinker who co-writes the Marginal Revolution blog and hosts the Conversations with Tyler podcast. To listen to the podcast: Microsoft WorkLab - Economist Tyler Cowen on the positive side of AI negativity
Cowen starts off by talking a bit about his past and current views. In his book The Great Stagnation (2011) he famously argued that innovation had stalled, and he adds the context that in one of his later books he predicted AI would pull us out of that slump. He also says that biomedical and AI advances show we are back on the track of progress. I'd certainly agree that research and innovation in these fields have been meteoric in the past five years, and we are just now starting to see the effects of this progress on society. Cowen also discusses this and notes that AI tools have had a big impact on a personal level but not yet on a societal level (GDP, productivity growth).
From Bricklayer to Architect
As a developer myself, I think that AI has raised the ceiling on the amount of work that can be produced by an individual. Specifically, the time taken to produce work has been reduced and, through agentic AI, automated in some cases. But AI is only useful in that context if you know what you are doing - it's all too easy to paste your requirements into Gemini or Cursor and ship whatever comes back, yet if a seasoned developer were to review that code, they would not pass it. The human elements - designing a proper solution architecture, choosing the tech stack, and understanding the full technical and business context of the problem - are the limiting factors when asking AI to write code. Even the context window is a limit: the AI may not be able to see all of the code, or have enough tokens to produce a solid output. I find myself having to give Cursor very specific requirements, meticulously checking the code afterwards, or taking a different approach entirely.
For a university graduate trying to break into a technical role, I ask myself: how would they learn to use AI responsibly in a production workflow and leverage its advantages? And how can they demonstrate their value to a hiring manager? My answer would be to return to basics and to see the bigger picture. The human element of software development cannot be overstated. As Cowen suggests, the ability to write is becoming the ability to think. The code itself is becoming a commodity, while the ability to orchestrate that code into a cohesive solution is where the true value lies. Developers are moving from being bricklayers to architects, and the juniors who succeed will be the ones who understand how the building stands up, not just how to mix the mortar.
Legacy Systems Will Fail
While the struggling university graduate is figuring out these new AI workflows, the institutions meant to train us and the companies meant to hire us are lagging behind. Cowen notes that existing companies, non-profits, and universities mostly fail when trying to reorganise themselves around AI.
"Think back to the earlier years of the American auto industry. Toyota is a threat to General Motors. Try to get General Motors to do the good things Toyota was doing, that’s arguably a much simpler problem, but mostly they failed."
This strikes a chord with me. GM knew what Toyota was doing right, but it could not replicate it. Union contracts, existing factory floor layouts, and thousands of people across the organisation made such a change near-impossible. Similarly, companies today are trying to 'sprinkle' AI on top of their archaic workflows - they want to have their cake and eat it too. If you want the efficiency of an AI-native firm, you need to be an AI-native firm, not a legacy organisation.
Flipping the Syllabus
Cowen also notes that "we do not have the professors to teach AI literacy", and that it's hard to teach AI skills because everything is advancing so rapidly. The bottleneck isn't just corporate, it's academic. Checking code meticulously and knowing what to look for is a skill learned through doing. I'd argue that current pedagogy is not fit for the 'adaptive skills' that AI requires. A transition from fixed skills (how to use Excel or Python) to dynamic reasoning (how to formulate the right questions) and intellectual agility (how to unlearn and relearn quickly) will need to be the focus of modern curricula. In effect, modern education needs to treat every student as a Technical Lead from day one - evaluation and analysis being the key skills here.
Charisma as Currency
So code is becoming a commodity and institutions are too slow to adapt. What is actually scarce in the job market? Cowen's answer: charisma, physical presence, and the ability to persuade. Today's world lets anyone generate a perfectly written cover letter and ace any take-home assessment. That means those metrics no longer matter - they don't prove anything.
"Who can actually vouch for you, recommend you, speak to what you’ve done with them or for them?"
This is the all-important human premium. We are reverting to a relationship-based economy where your ability to build a network and communicate complex ideas face-to-face matters more than your ability to grind out code at home. If the AI can build the solution, the most valuable person in the room becomes the one who can convince the stakeholders to adopt it. Cowen also suggests a counter-intuitive metric for tracking this progress: disorientation. He argues that if people are unhappy, confused, and complaining on Twitter (X), it is actually a sign that the technology is working. It is a harsh truth, but one I see in the developer community daily. The anxiety we feel - Am I learning the right stack? Is this tool going to replace me? Are these certifications even going to matter? - is the visceral evidence of the productivity boom we are waiting for. If we were comfortable, it would mean we were stagnating. The friction is the point.
Don't Wait For Them
For those of us in the trenches, whether university graduate or senior engineer, the message is clear. We cannot wait for our companies to become "Toyota" or for our universities to update the curriculum. By the time they do, the tools will have changed again and a new start-up will be targeting a $1 trillion IPO. The responsibility rests on us to become the architects of our own workflows, the sceptics of our own tools, and the builders of our own networks. The AI can write the code, but it cannot shake a hand, it cannot align a stakeholder, and it certainly cannot decide what is worth building in the first place.
That part is still up to us.