AI is hiring while replacing work
Are we building real expertise with emerging technology, or just passing through a transition phase?
Someone got mocked on social media for hiring a “token optimization specialist”.
The pile-on was predictable. If AI replaces humans, why are we hiring more people to manage it? Ha ha, tech bros, etc.
Here’s the thing, though.
The obvious part first
The mockery is lazy, but so is the defense.
When something gets cheaper to produce, you don’t produce less of it. You produce way more. AI makes text, code, and images cheap, so now there’s more of all three, and more of everything creates new problems: API bills, inconsistent outputs…
Someone has to care about that stuff. New roles emerge. This is just how it works.
Cloud computing didn’t eliminate infrastructure headaches; it created cloud architects.
Social media didn’t simplify communications; it created a whole profession.
Fine. That’s the easy answer.
But it’s also not simply “AI will create new jobs, so relax.”
Because some of these jobs have an expiration date, and the cycle is moving fast enough that you might not even see it coming.
In 2020, GPT-3 came out. And I don’t mean ChatGPT. There was no interface, no “talk to an AI.” You had to apply for API access. Write a pitch explaining who you were, what your company did, why your use case was legitimate. Then wait.
When we finally got in, we were excited. We were building some app ideas and needed content generated at scale. GPT-3 was powerful but unpredictable. You couldn’t just ask it something and trust the output. You needed someone who really understood how to coax it.
We found a guy. He was technically skilled, almost artisanal about it, crafting prompts with this specific structure and logic that got GPT-3 to produce what we actually wanted. We still had to do a lot of cleanup because the results were nowhere near perfect.
It was real work, and he was really good at it.
It was also obsolete within two years.
I’m sure he found new ways to specialize. But the specific skill that made him valuable in 2021 got swallowed by the next wave of the technology. The thing he knew became something the model just did on its own.
So when someone talks about “prompt engineer” like it’s a stable career path being built right now, maybe it is. But maybe it’s also just another two-year window.
The nature of the new work itself
More AI-generated content means more noise, which means we need people to filter and verify. More AI-generated code means more bugs, which means we need more oversight. More scale means more cost management.
That’s real work. But there’s a difference between jobs that unlock new capability and jobs that exist because we created a mess. Both count. They’re not the same thing.
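To make the cost-management part of this concrete, here’s a minimal sketch of the kind of accounting a “token optimization specialist” might automate: estimating what a prompt costs at scale. The tokens-per-word ratio and the price are hypothetical placeholders, not real vendor numbers, and real tokenizers vary by model.

```python
def estimate_tokens(text: str, tokens_per_word: float = 1.3) -> int:
    """Crude token estimate. Real tokenizers differ per model;
    the 1.3 ratio is a rough rule of thumb, not a vendor figure."""
    return round(len(text.split()) * tokens_per_word)


def monthly_cost(prompt: str, calls_per_day: int,
                 price_per_1k_tokens: float) -> float:
    """Cost of sending the same prompt `calls_per_day` times for 30 days,
    at a hypothetical price per 1,000 tokens."""
    tokens = estimate_tokens(prompt)
    return tokens / 1000 * price_per_1k_tokens * calls_per_day * 30


# Example: a short prompt, 10,000 calls a day, at a made-up $0.01/1K tokens.
prompt = "Summarize the following support ticket in two sentences: ..."
print(f"~{estimate_tokens(prompt)} tokens per call")
print(f"~${monthly_cost(prompt, 10_000, 0.01):.2f}/month")
```

Trivial math, but multiply it across dozens of prompts and models and someone has to own it. That’s the job.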
The “people said the same about SEO” defense is also doing too much work. The idea is that SEO specialists and social media managers both sounded like made-up jobs before they became standard.
Sometimes that’s true, new categories of work look absurd before they become obvious. But sometimes new roles are inflated, or reactive, or straight-up corporate theater. Asking which is which is just a reasonable question.
So no, hiring a token optimization specialist isn’t ridiculous. Something real is forming.
But “technology creates jobs” is an aggregate truth that skips over who gets those jobs, how long the gap lasts, and who absorbs the cost while we wait for the economy to sort itself out.
Who actually gets these jobs? Because it’s not the same people. The junior dev whose tasks got automated isn’t the one optimizing inference costs at a startup. The content writer whose work got replaced isn’t pivoting to “AI output strategist”.
The economy adapts as an aggregate. Individuals don’t automatically adapt with it, and adaptation costs money, time, and access that aren’t evenly distributed.
The economy will figure it out eventually. It always does. The question is just who’s carrying the weight in the meantime, and whether we’re even willing to name them.

