I didn’t plan to write a book. It started as a question I couldn’t stop asking.

The question was simple, and it ruined my sleep for months: If a machine can write, paint, compose, code, and diagnose — what’s left for me?

I wrote that question in a notebook at 2 a.m. in San José, Costa Rica, in the middle of a power outage. No screens. Just a pen and the sound of rain on a tin roof. By the time the lights came back, I had four pages of handwriting that would eventually become The Last Skill: What AI Will Never Own.

Here is what the writing taught me.


Lesson 1: The fear is real

I started with research. I thought the data would give me distance from the question, the way numbers sometimes let you stand outside a feeling and look at it calmly.

It did the opposite.

Forty-one percent of workers worldwide believe AI will make their jobs obsolete within five years. Not “might change their work.” Obsolete. Therapists across the United States and Europe report a surge in patients presenting with what they’re calling “FOBO” — fear of becoming obsolete. Creative professionals who spent decades building a craft are watching entry-level positions evaporate. Entire categories of work that existed eighteen months ago simply don’t anymore.

I had expected the statistics. I had not expected them to hit me in the chest.

Writing about fear made the fear worse before it got better. Every study I read, every interview I conducted, every graph I plotted confirmed the same thing: this is not a drill. The displacement is happening now, to real people, in real offices and studios, and the timeline is not “someday.” The timeline is Tuesday.

I almost quit the manuscript twice. Both times for the same reason — I felt like I was documenting a house fire from inside the house.

But here’s what kept me going. Every person I talked to who was afraid also said the same thing, unprompted: Nobody is talking about this honestly. The tech industry was celebrating. The media was cycling between panic and hype. And the people actually living through the shift felt invisible.

That made me angry enough to keep writing.


Lesson 2: AI is better at writing than I expected — and worse at thinking than I hoped

I used AI tools during the writing process. Of course I did. It’s 2026.

I used them for research summaries, for brainstorming chapter structures, for generating first-pass descriptions of studies I wanted to reference. And I will admit something that most authors won’t: the output was often impressive. Clean. Coherent. Fast.

But then I’d sit with the text, and something would nag at me.

The AI could write a paragraph about fear that read perfectly well. Every sentence was grammatical. The transitions were smooth. The vocabulary was appropriate. And it meant absolutely nothing. There was no weight behind the words. No one had stayed up until 2 a.m. earning those sentences. The paragraph was a performance of understanding without the understanding itself.

This taught me the exact contours of the gap between generating and authoring. Generating is pattern completion — statistically plausible sequences of language. Authoring is putting something at risk. It’s deciding what matters, what to leave out, what hill to die on. It’s the willingness to be specific when vagueness would be safer.

Every time I accepted an AI draft without rewriting it, I could feel the manuscript go flat. Every time I threw it away and started from my own confusion, the writing got better. Not because my prose was more polished — it often wasn’t — but because there was a person behind it who was actually trying to figure something out.

The gap between generating and authoring became one of the central arguments of The Last Skill. I didn’t plan it. The writing process itself revealed it.


Lesson 3: The book I needed didn’t exist

Before I started writing, I read more than fifteen books about AI. I read the optimistic ones and the apocalyptic ones. I read the ones by engineers and the ones by philosophers. I read the ones for executives, the ones for policymakers, the ones for developers.

None of them were for me.

None of them were for the person lying awake at 3 a.m. wondering if they still mattered.

The executive books assumed you had leverage — a budget, a team, a strategy to deploy. The policy books assumed you had influence — a seat at a table where decisions get made. The technical books assumed you wanted to build the thing rather than survive it. And the philosophical books, while beautiful, floated above the ground where most people actually live.

What I couldn’t find was a book that started with the emotional reality. A book that said: Yes, you are right to be afraid. Now let’s figure out what the fear is actually about, and whether there’s something on the other side of it.

So I wrote that book. Not because I had the answers, but because I needed them. The best reason to write anything, I think, is that you went looking for it on the shelf and it wasn’t there.


Lesson 4: Putting your name on something in 2026 is an act of courage

This one surprised me.

Somewhere around the third draft, I realized that the act of publishing a book under my own name — my real name, attached to my real opinions, carrying my real mistakes — had become a radical thing to do.

We live in an age of generated content. Entire blogs, social media accounts, marketing campaigns, and customer support systems run on text that no one wrote. The words exist. They were produced. But nobody stands behind them. There is no name. No reputation at stake. No one who will flinch if the text turns out to be wrong.

AI is never wrong in public. It has no public. It produces language without exposure. It generates opinions without holding them.

When you put your name on a book, you are doing the opposite. You are saying: This is what I believe, and I’m willing to be identified with it. Judge me. You are accepting the possibility of being wrong, being criticized, being misunderstood. You are choosing a position in a world that increasingly rewards having none.

I think about this a lot now. I think about what it means that the most human thing about writing was never the writing itself — it was the willingness to be accountable for it. The craft matters, yes. The style matters. But underneath all of it, there is a person who signed their name and said: This one is mine.

That cannot be automated. Not because the technology is insufficient, but because accountability requires a self. A name. A life that will absorb the consequences of having spoken.


I finished The Last Skill in late 2025, and it came out feeling different from the book I thought I was writing. I thought I was writing a book about AI. I wrote a book about what it means to be a person who makes things in an era when making things is no longer a uniquely human act.

The Last Skill isn’t a book about AI. It’s a book about you. About the part of you that knows the difference between a thing that was produced and a thing that was meant. About the stubborn, irrational, magnificent insistence on being the one who says it — even when a machine could say it faster, cheaper, and with fewer typos.

If that sounds like something you need right now, I wrote it for you.

Juan C. Guerrero is the founder of Anthropic Press and the author of The Last Skill: What AI Will Never Own. Born in Costa Rica, he believes human authorship remains the primary fact of any universe worth understanding.