“AI is the new cigarette.”
When a colleague said this in the waning days of 2022, days after ChatGPT burst on the scene, she took my breath away. The idea that this miracle would kill us seemed confined to hysterical handwringing foretelling the birth of Skynet.
She was right.
But neither of us knew it was designed to be that way.
Designed for addiction
My friend predicted that ChatGPT would stay free and helpful until usage reached “critical mass,” and then we’d have to pay. Less than three months after its November launch, OpenAI introduced its $20 per month service.
But it’s not the “first one’s free, the next one will cost you” aspect of drugs that makes AI addictive. It’s the design decisions at its core that keep you coming back:
- Purchase Decoupling, in which you convert real money into tokens, creating psychological distance between you and your actual spending
- Difficulty Curve, in which skills and benefits accumulate quickly at first, giving you the sense that you’re becoming more capable over time, so you stay committed even after progress slows
- Skill Atrophy, in which every skill you stop practicing, because the machine does it for you, quietly disappears
Even casual AI users have experienced one or more of these:
- You get a message mid-chat telling you you’ve used all your tokens and need to come back in three hours, even though you’ve paid your monthly $20 fee
- You’re prompting in all caps because it’s the only way you can think of to get the LLM to stop hallucinating, while reminiscing about the days when it was a brilliant thought-partner
- You’ve relied on AI to outline articles for the last several months, but you need to write in a different style and have no idea how to get started
And yet, we keep going back.
But it’s not just individuals who are addicted. It’s entire organizations.
Signs that your organization is addicted to AI
Your CFO asks for the total AI spend across the organization. Three weeks and four departments later, the number is three times what anyone expected because the licenses are buried in IT infrastructure budgets, the pilots are expensed as innovation projects, and half the tools were purchased by business units on corporate cards.
The board approved the AI transformation initiative based on the pilot results. Eighteen months later, the pilot case study slide hasn’t changed, headcount has been reduced in anticipation of productivity gains that haven’t materialized, and the team running the pilot has quietly moved on to other work.
You eliminated the analyst pool two years ago because AI could do in minutes what they did in days. Now you need to evaluate whether the AI’s output is actually correct, and you’ve just realized there’s nobody left in the organization to check it because everyone who’s done it is gone.
Sound familiar? Your organization is an addict.
Recovery is possible
Addiction can’t be cured, only managed. The same is true for AI.
The road to recovery starts in a similar place: visibility.
- Centralize AI spending the way you centralize other business processes, while allowing some flexibility through strict spending limits, clear decision-making criteria, and clear ownership.
- Start pilots with the end in mind by establishing success metrics and scaling plans at the start of the pilot, not once it’s already in progress.
- Treat certain human capabilities as strategic reserves the same way you’d treat any critical operational dependency. Before automating a function, explicitly document what judgment and expertise currently lives there, who holds it, and what it would cost to rebuild it if needed.
Unlike cigarettes or gambling, AI has reached a point where we can’t quit it.
But we can be aware of our addiction, and we must manage it.
The first step is admitting that it’s real. And by design.