Last week Bloomberg ran the headline: "The AI Job Apocalypse Is Being Delayed." Fortune had the Pearson CEO calling the whole thing a Silicon Valley story with no data to back it up. On X, a thread digging into Anthropic's actual labour-market research went viral - and the conclusion was quieter and more unnerving than the panic: AI isn't replacing jobs at scale yet, but it's already reshaping who gets hired and which roles quietly stop growing.

Meanwhile, every AI CEO on every podcast continues predicting civilisation-level disruption. None of them are wrong to be excited. But they're also not wrong to be incentivised.

Here's the thing most business owners miss: the apocalypse narrative isn't a prediction. It's a sales strategy.

When Sam Altman says AI will replace 90% of jobs, he isn't speaking as a detached economist. He's raising capital, recruiting engineers, lobbying regulators, and building a perception of inevitability that makes adoption feel like survival. That's not cynicism - it's just how the game works. Fear sells. Urgency converts. The bigger the threat you define, the more essential your solution becomes.

Scott Galloway, in a recent episode of Diary of a CEO, put the incentive structure bluntly: the AI job-apocalypse story is partly marketing theatre. The rich are buying the assets. Everyone else is buying the anxiety. That's not a conspiracy - it's just how capital has always worked when technology shifts the ownership of leverage. The people who own the infrastructure win. The people who rent their labour without distribution, relationships, or accumulated judgement absorb most of the disruption.

The practical read for any business right now is this: stop asking "will AI take our jobs?" and start asking "who benefits from us believing that it will?"

Because the real risk isn't the apocalypse. The real risk is panic-buying AI tools to solve problems you haven't properly diagnosed, dismantling capabilities that are actually competitive advantages, and handing your strategic thinking to software that has no stake in your business outcomes.

What actually holds value when software gets cheap? The same things that always held value when a powerful new tool arrived: better briefs, sharper commercial judgement, genuine client trust, the ability to tell a story someone wants to believe, and the resilience to keep selling through rejection. None of that lives in a model. It lives in the people running the business.

This doesn't mean ignore AI. It means use it with adult supervision. The operators winning right now aren't the ones who swallowed the apocalypse pitch whole - they're the ones who picked up the specific tools that reduce friction on work they were already doing well, and kept their human judgement running alongside them.

The distinction matters for every business making AI investment decisions in 2026. You're not choosing between "AI or no AI." You're choosing between two versions of adoption: one where you chase the narrative, and one where you actually understand what problem you're solving.

The narrative serves the sellers. Your business serves your customers. Make sure you know which one you're optimising for.