Thought Leadership

The Strange Grief of Becoming Unnecessary

What happens psychologically when you watch AI master your skills in real-time? A reflection on obsolescence, identity, and what we're really mourning.
February 16, 2026 · 11 min read
TL;DR: The psychological experience of watching AI master your skills is stranger than the economic predictions suggest. It feels like grief, but for something you haven't lost yet. Tech workers are going through it first, but the emotional pattern will repeat across every industry. Understanding this process may matter more than any career advice.

A few weeks ago, Matt Shumer, co-founder of an AI startup, published an essay that went viral. The title was simple: "Something Big Is Happening." In it, he described a moment that thousands of people in tech have experienced but few have articulated: the realization that you are no longer needed for the actual technical work of your job.

"I describe what I want built, in plain English, and it just... appears," he wrote. "Not a rough draft I need to fix. The finished thing."

I know exactly what he means. I run AI agents for real work. I've watched them go from useful assistants to something approaching colleagues. The experience is disorienting in ways that career counselors and LinkedIn thought leaders seem completely unprepared to address.

Because the thing nobody tells you about becoming unnecessary is that it doesn't feel the way you expect. It doesn't feel like losing a job. It feels like losing a piece of yourself.

The Kübler-Ross of Capability

When Elisabeth Kübler-Ross wrote about the five stages of grief in 1969, she was describing how people process death. Denial, anger, bargaining, depression, acceptance. It turns out this framework applies reasonably well to watching your professional skills become obsolete in real-time.

83% of workers believe AI will significantly change their jobs within 5 years (Pew Research, 2025)

The denial stage is familiar. "AI makes mistakes. It hallucinates. It can't really understand context." All true, sometimes. But each month, the list of things AI "can't really do" gets shorter. The denial starts to feel like checking your pulse to prove you're alive.

The anger stage surprises people because they don't recognize it as anger. It manifests as irritation at AI hype, at colleagues who seem too enthusiastic about the technology, at think pieces that feel breathless or naive. The anger isn't really about AI. It's about the uncomfortable sensation of the ground shifting under your feet.

Bargaining sounds like "AI is just a tool, and tools need human operators" or "AI will augment us, not replace us." These statements might even be true. But when you find yourself repeating them like a mantra, you've entered bargaining territory. You're negotiating with reality.

The depression stage is quieter. It shows up as a loss of motivation, a vague sense that your work doesn't matter the way it used to, a difficulty explaining to people outside your field why you feel so unsettled when you still have a job and a salary. This is the stage where people start doom-scrolling AI Twitter at 2 AM, looking for evidence that the future isn't as bleak as it feels.

Acceptance is not resignation. It's something else entirely, and I'll get to it. But first, we need to understand what we're actually mourning.

Identity Is the Real Casualty

Here's what the economic analyses miss: professional skills aren't just things you do. They're part of who you are.

If you've spent fifteen years becoming an expert at something, that expertise shapes your identity. It determines how you introduce yourself at parties, how you answer your kids when they ask what you do all day, how you think about your value in the world. When someone says "I'm a software engineer" or "I'm a financial analyst" or "I'm a radiologist," they're not just describing their job. They're describing themselves.

Key Insight: The grief of becoming unnecessary isn't primarily about money or job security. It's about identity. We mourn the person we spent years becoming, even when that person still exists.

This is why the standard career advice feels so hollow. "Just upskill." "Learn to use AI tools." "Become an AI-human hybrid worker." These suggestions might be practically useful, but they completely miss the psychological reality. You can't upskill your way out of an identity crisis.

I've talked to programmers who feel a strange emptiness when they realize they can describe a complex system in a paragraph and watch it materialize without writing a single line of code. The code was the point, they tell me. The craft was the point. Being able to produce the same outcome without the craft feels less like progress and more like loss.

A lawyer friend described it differently. "I spent a decade learning to read case law, to spot the relevant precedent, to construct arguments that hold up under scrutiny. Now I can ask Claude to do that in thirty seconds, and it's better than my first draft." He paused. "I know I should feel liberated. But I just feel... smaller."

The Acceleration Problem

What makes this moment particularly disorienting is the pace. Previous technological transitions gave people time to adapt. The shift from typewriters to word processors happened over decades. Accountants had years to migrate from ledgers to spreadsheets. Even the internet, disruptive as it was, transformed industries over a twenty-year arc.

18 months: time between GPT-4 and current frontier models
12–18 months: Microsoft AI CEO's prediction for white-collar automation
0: decades available for gradual adaptation

Mustafa Suleyman, Microsoft's AI CEO, recently told the Financial Times that AI will automate most white-collar work within 12-18 months. Whether that timeline is accurate is almost beside the point. What matters is that the people building these systems think the transformation is imminent. And they're acting accordingly.

This speed doesn't give our psychology time to catch up. Normally, when a profession changes, people have years to adjust their identity alongside their skills. They can gradually incorporate new tools while maintaining a sense of continuity with their past selves. The accountant who adopted Excel in 1990 was still an accountant. The journalist who learned to write for the web in 2005 was still a journalist.

What happens when the change is so fast that there's no continuity to maintain? What happens when Monday's expertise becomes Wednesday's automation?

The honest answer is: we don't know. We're running a psychological experiment on an entire civilization, and no one asked for consent.

The Practitioners' Paradox

Here's an irony that keeps me up at night. The people who are most clear-eyed about what's coming are often the people using AI most heavily. They see the trajectory because they're living it.

Shumer's essay resonated because he wasn't making predictions. He was describing what already happened to him. The AI safety researchers leaving their jobs at major labs aren't speculating about future risks. They're responding to things they've already seen.

The paradox is that becoming proficient with AI tools often accelerates your confrontation with your own obsolescence. Every time you discover that AI can handle something you thought required your human judgment, you update your model of what's left that's uniquely yours. And the "uniquely yours" territory keeps shrinking.

A Reframe: This is uncomfortable, but it might also be useful. The people who confront this reality early have more time to figure out what comes next. Denial is comfortable, but it's also a form of unpreparedness.

I've noticed that the most psychologically healthy practitioners aren't the ones who deny what's happening, and they're not the ones who catastrophize about it either. They're the ones who've decided to ride the wave while staying curious about what they'll become on the other side.

What We're Actually Mourning

Let me be precise about this, because precision matters.

We're not mourning our jobs, at least not primarily. Jobs can be replaced. We're not mourning our skills, because skills were always temporary. The blacksmith's skill with an anvil didn't make him less of a person when factories appeared; it just made him someone whose valuable skills had changed.

We're mourning something more fundamental: the story we told ourselves about why we mattered.

If you became good at something difficult, you probably built an identity around that difficulty. The years of practice, the accumulated expertise, the hard-won intuition. These things felt earned. They felt like proof of something essential about you.

And now you're watching a system match that expertise in seconds, without the years, without the struggle, without any sense of having earned it. The system doesn't even know what "earning" means.

"I keep giving them the polite version. The cocktail-party version. Because the honest version sounds like I've lost my mind."
Matt Shumer, on describing AI progress to people outside tech

This is what makes the current moment feel so surreal. We're being asked to decouple our sense of self-worth from our productive output, and we have no cultural template for how to do that. The self-help books haven't been written yet. The therapy frameworks don't quite fit. We're improvising.

The Liberation Nobody Talks About

I want to be honest about something else, something that feels almost taboo to admit: there's also something liberating about this.

For years, my sense of self was wrapped up in what I could produce. The quality of my work was the quality of me. That's an exhausting way to live, even when you're successful. Especially when you're successful, because success raises the bar you have to clear tomorrow.

Watching AI take over tasks that I used to identify with has forced me to ask questions I'd been avoiding. What do I actually enjoy versus what I do because I'm good at it? What parts of my work are meaningful versus what parts are just productive? What would I do if I didn't have to prove my value through output?

These are uncomfortable questions. They're also, arguably, the right questions.

The strange thing about grief is that it often precedes growth. You have to lose something before you can find something new in its place. The identity built around professional skill was never going to last forever anyway. AI just forced the transition to happen faster than anyone expected.

Key Insight: The crisis of becoming "unnecessary" might also be an opportunity to become something else. Not necessarily something better. Just something different, and perhaps more honest about what human life is actually for.

Finding Footing in the Fog

I don't have a tidy conclusion. Anyone who offers one right now is either lying or hasn't understood the scope of what's happening.

But I'll share what's helped me.

First: name the grief. The feeling of watching your skills become obsolete is a real loss, and it deserves to be acknowledged as such. Pretending it's fine when it isn't fine doesn't make you resilient; it makes you brittle.

Second: talk to others going through it. The isolation of this experience is partly an illusion. Millions of people are having some version of this crisis right now. Finding them doesn't solve the problem, but it makes the problem feel less lonely.

Third: get curious about what comes next. Not in a forced-optimism way, but in a genuine "I don't know what I'm becoming" way. The people who thrive through transitions tend to be the ones who stay curious rather than defensive.

Finally: remember that you are not your output. This sounds like a greeting card platitude, but in a world where AI can match human output, it might be the most practical truth available. Whatever you are, it's not reducible to what you produce. If it were, AI would already be you.

We're all improvising through this. The grief is real. The disorientation is valid. And somewhere on the other side, there's a version of ourselves we haven't met yet.

I don't know what that version looks like. But I'm curious to find out.


This article is part of our series on the human side of AI transformation. If you're navigating similar questions, our community is full of people working through the same transition.


Future Humanism editorial team


