From “Transform or Die” to “Threat → Agency”: The Human Side of AI in 2026
A thought experiment:
If you introduce AI in your company like a compliance topic — mandatory trainings, a bit of pressure, a bit of vague “transform or die” energy — what exactly do you expect people to learn?
Often they learn avoidance. Or cynicism. Or quiet, professional resistance.
And that response isn’t irrational. Research (and a little bit of history) suggests it’s… pretty normal.
Big shifts trigger big feelings — and AI is one of those rare topics that can spark excitement and alarm in the same meeting. In surveys, large parts of the public even describe themselves as more concerned than excited about AI’s growing role in daily life. [1][2]
The worries are also remarkably consistent across studies:
- job displacement
- loss of human creativity / “human touch”
- biased or untrustworthy outputs
- privacy and data leakage
- loss of control (over work, reputation, security) [3]
What’s useful (especially if you lead teams) is this: it’s not just a rational risk list. It’s also an emotional change curve.
Two classic models are surprisingly good lenses here:
- Lewin’s Unfreeze → Change → Refreeze: people first have to let go of old assumptions, then live through a messy in-between, and only later does a new normal stabilize. [4]
- Kübler-Ross-style stages (borrowed from grief): denial, anger, bargaining, discouragement/depression, acceptance — with overlap, loops, and back-and-forth. [5]
So when your org feels “in flux” about AI, it might not be a failure.
It might be the process.
One framing that helps me: early resistance is often data. Not something to crush, but something to listen to, because it points to what people need clarified before they’ll move: safety, ethics, bias, accountability, reskilling, and the boundaries of “allowed” experimentation. [6]
So the question becomes less “How do we roll out AI?” and more:
How do we help people move from threat → agency?
How do we make room for fear without freezing progress?
How do we avoid both extremes — doom and hype — and stay practical?
If you’re leading this topic in 2026, I’d love to hear: what’s the biggest friction in your org right now — data/privacy, output quality, or the human stuff nobody says out loud? 🌈
Read more
1. How Americans view AI and its impact on people and society — Pew Research Center (Sep 17, 2025).
2. How people around the world view AI: concern vs. excitement across countries — Pew Research Center (Oct 15, 2025).
3. Trust, attitudes and use of artificial intelligence: global report — KPMG (May 2025).
4. What is Lewin’s Change Theory? (Unfreeze–Change–Refreeze) — Prosci (Oct 1, 2024).
5. Kübler-Ross Change Curve® (non-linear stages applied to change) — Elisabeth Kübler-Ross Foundation.
6. Decoding resistance to change — Harvard Business Review (Apr 2009).
