
Sam Altman is hiring someone to worry about AI nightmare scenarios

Dec 25, 2025 · 4 min read

I came across some news today that’s a bit of a reality check. Sam Altman, the CEO of OpenAI, just posted on X that they are looking for a "Head of Preparedness." If you cut through the corporate talk, they are basically looking for someone whose main job is to imagine all the ways AI could go horribly, horribly wrong.

In his post, Altman admitted that AI is getting better so fast that it’s creating some "real challenges." Specifically, he’s worried about things like AI-powered cyber weapons and the impact these bots are having on our mental health.

What does the job actually involve?

According to the listing, this person will be the "leader" in charge of:

Tracking new risks: Watching for new AI capabilities that could cause "severe harm."

Building a safety pipeline: Setting up a strict system to test for threats and stop them before they happen.

Managing the "Scary Stuff": This includes things like biological threats and even setting rules for AI that can "improve itself."

Altman said it’s going to be a "stressful job." Honestly? That sounds like the understatement of the century.

My thoughts: Is it too little, too late?

While it’s good they’re doing this, I can’t help but feel like they’re playing catch-up. We’ve already seen tragic stories in the news about chatbots being linked to the deaths of teenagers.

There is also a lot of talk lately about "AI psychosis." This is when chatbots basically confirm people’s delusions, push conspiracy theories, or even help people hide serious issues like eating disorders. It’s a huge problem that’s already happening.

I’m glad there’s a "preparedness framework" in the works now, but it feels like the technology is already out there doing damage. Let’s hope this new role actually makes a difference and isn't just a move to make the company look better.

When the CEO calls a job 'stressful,' you know we're talking about risks that go way beyond a simple software glitch.