Open any newspaper, any comment section, any Slack thread where someone just watched their company deploy a new AI tool, and you’ll hear the same question: Will AI replace my job?

I’ve heard it from copywriters, accountants, radiologists, junior developers, paralegals, and — in one memorable conversation — a portrait painter who hadn’t touched a computer since 2014. The question comes from everywhere, and it always carries the same undertone of dread. It sounds reasonable. It sounds urgent. And it is, I think, almost entirely the wrong thing to be asking.

Not because AI won’t change work. It will. It already has. The wrong part is the framing. “Will AI replace my job?” treats you as a passive object — a thing that either gets kept or discarded by forces beyond your control. It assumes you are your job title, that your value is your function, and that the only question left is whether a machine can perform that function cheaper.

That framing leads exactly where you’d expect: anxiety, paralysis, doom-scrolling through LinkedIn posts about prompt engineering bootcamps at 1 a.m.

The better question — the one I wish more people would sit with — is this: What kind of person do I want to be in a world where machines handle the routine?

The first question produces helplessness. The second produces agency. And the distance between those two outcomes is the real story of AI and work that almost nobody is telling.

The identity problem disguised as an employment problem

Here’s what I think the AI-and-jobs panic is actually about, underneath all the economic modeling and the McKinsey reports: for a huge number of people, their identity is their job. Ask someone who they are at a dinner party and they’ll tell you what they do for a living. Their competence at work is their self-worth. Their professional title is their introduction to the world.

That’s not a character flaw. We built a culture that encouraged it. Decades of “do what you love” rhetoric fused personhood with profession so completely that threatening someone’s job now feels like threatening their existence. AI didn’t create this fragility. It’s just the first thing powerful enough to expose it at scale.

A Gallup survey found that 41% of workers globally fear AI will make their roles obsolete. Forty-one percent. That’s not a labor market statistic. That’s a mental health crisis hiding in an economics wrapper. Therapists across the U.S. and Europe report a surge in clients presenting with what some are now calling “FOBO” — fear of becoming obsolete. People aren’t just worried about their paychecks. They’re worried about whether they’ll matter.

And you can’t solve an identity crisis with a retraining program.

The ATM lesson nobody remembers

When the automated teller machine rolled out in the late 1970s, the prediction was obvious: bank tellers would vanish. Machines could count cash faster, work 24 hours a day, and never miscount a twenty. The tellers were finished.

Except they weren’t. Between 1980 and 2010, the number of bank teller jobs in the United States actually increased. The reason was counterintuitive but, in retrospect, straightforward. ATMs made it dramatically cheaper to operate a bank branch. So banks opened more branches. More branches meant more tellers. But — and this is the part that matters — the nature of the work changed. Tellers stopped counting cash all day and started doing something ATMs couldn’t: building relationships with customers, selling financial products, solving problems that required a human reading a human.

The jobs didn’t disappear. They transformed. The tellers who thrived were the ones who could do the thing the machine couldn’t. The ones who struggled were the ones whose entire skill set was cash counting — the routine the machine was built to absorb.

Every serious economist I’ve read on AI and employment tells some version of this story. The likely outcome isn’t mass unemployment. It’s mass transformation. Tasks within jobs will be automated. The jobs themselves will shift shape. Some will disappear entirely, yes. But new ones will emerge — ones we can’t name yet, the same way nobody in 1995 could have predicted “social media manager” or “UX researcher” as a career.

But here’s the part that keeps me up at night: transformation without support is just a polite word for abandonment.

People aren’t scared of change. They’re scared of being left behind while the change happens. And that fear is completely rational.

We have a track record on this. When manufacturing automated, we told displaced workers to “learn to code.” When coal mines closed, we told entire communities to “retrain for the green economy.” The advice was technically correct and practically useless. A 55-year-old machinist in Ohio does not need to hear that Python is the future. He needs a bridge — financial, emotional, structural — between the world he trained for and the world that’s arriving.

The real threat of AI isn’t that it will eliminate all jobs. It’s that it will transform jobs faster than our institutions can support the humans inside them. The technology moves at the speed of venture capital. The support systems move at the speed of Congress. That gap is where people get hurt.

What actually makes you irreplaceable

So if the answer isn’t panic, and the answer isn’t “learn to code,” and the answer isn’t “become an AI expert” — what is it?

I’ve spent the last two years thinking about this question, and I wrote The Last Skill partly as my attempt to answer it. The conclusion I keep arriving at is both simple and uncomfortable: the things that will make you irreplaceable in an AI-saturated economy are not skills in the traditional sense. They’re capacities. Human capacities. The ones that require a body, a history, a set of relationships, and a genuine stake in being alive.

I’m talking about the ability to sit with someone in genuine distress and make them feel less alone — not because you’ve computed the optimal empathetic response, but because you’ve been in distress yourself and you carry that knowledge in your nervous system. I’m talking about the willingness to make a judgment call when the data is ambiguous and the stakes are real — to put your name on a decision that might be wrong. I’m talking about the capacity to build trust over time, through consistency and vulnerability and the kind of presence that can’t be faked because it requires actually being there.

No amount of compute can simulate what it means to risk something. A language model can generate a business plan, but it can’t mortgage its house to fund it. An AI can draft a eulogy, but it didn’t lose the person. It can write an apology, but it has nothing to be sorry for. These aren’t edge cases. They’re the center of what makes human work human work.

The nurse who holds your hand before surgery. The teacher who notices a kid is off before the kid says a word. The manager who has a hard conversation because the easy one would be dishonest. The founder who makes a bet not because the spreadsheet says so but because something in her gut — built from years of pattern recognition that no training dataset could replicate — says this is the moment.

These are not soft skills. I hate that term. There is nothing soft about the ability to hold contradictions in your head, tolerate ambiguity, and act anyway. That is the hardest thing humans do. And it is the thing machines are structurally incapable of doing — not because they’re not smart enough, but because they don’t have skin in the game. They have no game. They have no skin.

The conventional advice right now is to “become AI-literate.” Fine. Learn how the tools work. That’s table stakes. But if you stop there, you’re just becoming a more efficient operator of a machine that will get more efficient without you. The real move is to develop the capacities that become more valuable as automation increases — precisely because they’re the ones automation can’t touch.

Think about it this way: the more AI-generated content floods the market, the more we crave something that feels unmistakably human. The more chatbots handle customer service, the more we value the moment a real person picks up the phone and actually listens. The more algorithmic everything gets, the more we’re willing to pay for genuine judgment, authentic connection, and the kind of creative weirdness that only comes from a life actually lived.

Scarcity drives value. AI is making routine cognitive labor abundant. That means the scarce thing — the thing that commands a premium — is everything routine cognitive labor isn’t.


I don’t know exactly what the labor market will look like in ten years. Nobody does, and anyone who claims otherwise is selling something. But I’m fairly confident of this: the people who fare best won’t be the ones who ran fastest on the treadmill of technical upskilling. They’ll be the ones who, when the machines took over the routine, had already built something inside themselves that the machines couldn’t reach.

So stop asking whether AI will take your job. Start asking what kind of person you’re becoming while everyone else is distracted by the question. Build the capacities that matter. Invest in the relationships that ground you. Do the things that scare you — not because a career coach told you to, but because fear is a reliable signal that you’re operating in territory no algorithm has mapped.

The future of work isn’t a problem to be solved. It’s a question to be lived. And the question was never about the machines.

It was always about us.


Juan C. Guerrero is the founder of Anthropic Press and the author of The Last Skill: What AI Will Never Own. He writes about what remains human in the age of artificial intelligence.