Essay · 2026

Why You Should Adapt to AI, Even If You Hate It

There's a specific kind of person who refuses to use AI. They're not unintelligent. Often the opposite. They have principled objections: it's unreliable, it's overhyped, it degrades the craft, it homogenizes thinking. Some of these concerns are legitimate. Most of them are also irrelevant to what's actually happening.

Here's what's actually happening: the people around you are using it. And they're doing in hours what used to take days.

I've seen this pattern before. When spreadsheets arrived, a generation of accountants refused to learn them. The numbers felt less real on a screen; the process felt too automated. Some of them were right that something was lost. But they still lost. The people who learned the new tools got faster, took on more clients, and outcompeted the ones who didn't. The same thing happened with the internet. With email. With every meaningful shift in leverage.

In every case, some people resisted intelligently and articulately. And in every case, the people who adapted won. Not because they were smarter or more principled, but because they had access to more leverage. That's what AI is: leverage. And unlike most tools, this one compounds unusually fast.

The argument I hear most often is something like: "I don't want to produce mediocre work faster." This is a real concern. But it confuses the tool with the person using it. AI doesn't produce mediocre work on its own. People who were going to produce mediocre work anyway now produce it faster. People who were going to do careful, thoughtful work now do careful, thoughtful work faster, with better research and more drafts before shipping. The quality of the output still traces back to the quality of the judgment behind it.

That's the part AI can't replace: taste. Knowing when something is wrong. Knowing when it's flat or generic or missing the actual point. The people most likely to use AI well are the ones who care most about quality, because they're the ones who can tell when the output isn't good enough. The ones who outsource their judgment entirely will produce worse work. But the ones who refuse to touch the tools at all will simply fall behind.

Adapting doesn't mean trusting everything AI produces. It doesn't mean you stop caring about craft. It doesn't mean you endorse the ways it was built or the companies that built it. It means you figure out how to use it in a way that amplifies what you're already good at, before someone who cared less about those things figures it out first.

You don't have to like it. But refusing to adapt isn't a principled stand. It's just falling behind while pretending you chose to.