There's an uncomfortable truth brewing in the startup world: AI just eliminated most entry-level jobs, and we're pretending it didn't happen.
The data paints a stark picture. Big Tech companies reduced new-graduate hiring by 25% in 2024 compared to 2023, the unemployment rate for recent college graduates reached 5.8% (its highest level since 2021), and the underemployment rate soared above 40%. Even more telling: Anthropic CEO Dario Amodei predicts AI could cut U.S. entry-level jobs by half within five years.
The Problem We're Not Talking About
When I ran my 50-person marketing agency across New York, London, and Poland, we had a predictable hiring pattern. Every summer: 2-6 interns. Most of our full-time team? Recent graduates with 0-2 years experience. The math worked because there were always those perfectly-scoped tasks—fuzzy enough to need human intelligence, simple enough for someone learning the ropes.
We had to be super careful about who could launch campaigns in our clients' Facebook ad accounts, so new employees wouldn't have "posting privileges" right away; they had to earn the trust of their managers, who would publish new ads in the account on their behalf. With AI, I have to imagine that incubation period will be even longer: how much harder is it now to come up with ad ideas good enough to beat the ones AI can generate? Does the manager just short-circuit the loop and use AI to think of new ideas to test in the account, rather than hiring the intern or new graduate in the first place?
Fast forward to today. I'm running Rally, an AI company that spins up synthetic personas to roleplay as customers. We're a lean three-person team, and here's the kicker: there's literally no task I'd hire an intern for anymore.
Why? If I'm going to invest the time to write detailed context for a task, I might as well write a prompt. The AI executes it flawlessly, doesn't need coffee breaks, and can repeat it a thousand times without fatigue.
The Da Vinci Surgery Parallel
This isn't unprecedented. The medical field faced something similar with robotic surgery systems. The da Vinci system became so precise (you've probably seen those viral videos of it operating on grapes) that senior surgeons could handle entire surgical caseloads without physical exhaustion.
The unintended consequence? Junior doctors lost crucial hands-on experience. Research from IEEE Spectrum found that surgical trainees often became "optional assistants in the OR," beginning their careers without adequate hands-on experience. As one study noted: "While the lead surgeon could theoretically give the trainee one of the robot arms to control, in practice it never happens."
The pattern is clear: when technology becomes sophisticated enough, senior practitioners monopolize it, inadvertently blocking traditional learning pathways for newcomers.
We're seeing the same pattern across every industry with AI integration. The World Economic Forum's 2025 Future of Jobs Report reveals that 40% of employers expect to reduce their workforce where AI can automate tasks, with technology projected to be the most disruptive force in the labor market.
The Simulation Solution: Fighting Fire with Fire
Here's where it gets interesting. AI created this problem—but maybe AI can also solve it.
The inspiration comes from an unexpected source: Ender's Game. If you haven't read Orson Scott Card's novel, the central premise follows brilliant children recruited into a military academy where they train through increasingly complex space battle simulations. The kids think they're playing elaborate war games—commanding fleets, making tactical decisions, sacrificing units to achieve strategic objectives.
The simulations are so realistic that the cadets develop genuine strategic thinking. They learn to manage resources under pressure, coordinate with teammates, and adapt to unexpected scenarios. Ender, the protagonist, becomes particularly skilled at these games, eventually advancing to what he believes is the final exam: a seemingly impossible battle against an alien home world.
In this climactic simulation, Ender makes a desperate choice. Facing overwhelming odds, he sacrifices nearly his entire fleet to deliver a single devastating weapon to the enemy's home planet. The strategy works—the weapon triggers a chain reaction that destroys the alien civilization entirely. Ender and his team celebrate their victory in what they assume was the most challenging simulation yet.
Then the adults in the room start crying.
*spoiler alert* The punchline hits like a sledgehammer: it wasn't a simulation. The battles were real. The "final exam" was actually the decisive battle in an ongoing war. Ender had just committed xenocide, wiping out an entire species while thinking he was playing a game. The simulation was so convincing that the kids couldn't distinguish it from reality—which is precisely why their performance was so effective.
The ethical implications of Card's story are deeply disturbing—the manipulation of children, the deception that enabled genocide, the psychological trauma inflicted on the protagonists. But beneath this dystopian narrative lies a profound insight about learning and performance psychology. When people believe they're operating in a realistic environment with genuine stakes, they develop authentic capabilities. The psychological realism of the experience, not its actual reality, drives meaningful skill development.
This principle has been validated across multiple domains, from aviation training to medical education. The key insight isn't about deception—it's about creating environments realistic enough that learners engage their full capabilities rather than holding back due to artificial constraints.
What if we could harness this principle for business training, but ethically? Create simulations transparent about their nature yet realistic enough to drive genuine learning?
Research suggests this approach works remarkably well. Studies of simulation-based training show that 82.5% of participants found scenarios realistic enough to transfer skills directly to real-world situations. More importantly, 77.8% felt emotionally prepared for complex challenges, while 80.6% gained genuine confidence in their abilities. The correlation between simulation experience and job readiness measures at 0.614—a strong relationship that suggests meaningful skill development.
Building the Business Battle School
The technical framework already exists. Large language models can roleplay convincingly as customers, colleagues, and stakeholders. We can script realistic business scenarios with genuine consequences within the simulation environment. Performance tracking systems can measure decision-making quality, teamwork effectiveness, and strategic thinking development.
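To make the framework concrete, here is a minimal sketch of the persona-roleplay piece. All names and fields here are illustrative assumptions, not Rally's actual implementation; the function just builds a system prompt you could hand to any chat-completion API.

```python
def persona_system_prompt(name, segment, pain_points, budget_sensitivity):
    """Build a system prompt instructing an LLM to roleplay one customer
    persona. Every field is hypothetical; swap in whatever segmentation
    your scenario scripting uses."""
    return (
        f"You are {name}, a customer in the '{segment}' segment. "
        f"Your main pain points are: {', '.join(pain_points)}. "
        f"You are {budget_sensitivity} about price. "
        "React to marketing messages as this person would: skeptical of "
        "hype, specific about objections, and consistent with your stated "
        "pain points."
    )

# Example: a persona for one simulated customer segment.
prompt = persona_system_prompt(
    name="Dana",
    segment="small-agency owner",
    pain_points=["reporting overhead", "client churn"],
    budget_sensitivity="very cautious",
)
print(prompt)
```

The key design point is that the persona is defined declaratively, so a scenario script can spin up dozens of distinct "customers" from a table of segments and score the trainee's messaging against each one.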
Imagine a recent graduate running Facebook ad campaigns in this simulated environment. They'd receive authentic feedback from AI personas representing different customer segments. Budget allocation decisions would have realistic consequences. A/B testing would generate believable performance data. Creative choices would be met with genuine customer reactions. The entire experience would feel like managing real campaigns—because from the graduate's perspective, they essentially are.
Running a Facebook ad A/B test in Ask Rally.
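As a hedged sketch of what "believable performance data" could mean in practice, here is a tiny simulator (hypothetical rates and names, not Ask Rally's actual model) that draws clicks for two ad variants from their true click-through rates, so a trainee's A/B test produces noisy but realistic numbers.

```python
import random

def simulate_ab_test(ctr_a, ctr_b, impressions, seed=None):
    """Simulate an A/B test: each variant is shown `impressions` times,
    and each impression clicks with probability equal to that variant's
    true click-through rate."""
    rng = random.Random(seed)  # seeded for reproducible scenarios
    clicks_a = sum(rng.random() < ctr_a for _ in range(impressions))
    clicks_b = sum(rng.random() < ctr_b for _ in range(impressions))
    return {
        "A": {"impressions": impressions, "clicks": clicks_a,
              "ctr": clicks_a / impressions},
        "B": {"impressions": impressions, "clicks": clicks_b,
              "ctr": clicks_b / impressions},
    }

# Variant B secretly has a slightly higher true CTR; with a small budget,
# sampling noise can still hide that edge, which is itself a lesson for
# the trainee about statistical power.
result = simulate_ab_test(ctr_a=0.020, ctr_b=0.025, impressions=5000, seed=42)
print(result)
```

Because the generator knows the ground-truth rates, the simulation can also grade the trainee: did they allocate budget toward the variant that was actually better, and did they run the test long enough to tell?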
Now, instead of giving busywork to an intern or recent graduate, I can give them access to a simulator that lets them get better at their job. They practice on real-world tasks, with the promise that as they progress through the "game" they earn my trust as a manager and get more hands-on experience in the real world. I can see at a glance which of my team members are struggling based on their simulation scores, so I can intervene and help them get back on track. Rather than the dystopian premise of Ender's Game, the same technology can be used for good.
The irony is almost too perfect: AI disrupted traditional career ladders, but AI-powered simulation might create the most equitable hiring system we've ever had. Instead of a world where getting started requires the right connections or expensive credentials, we could have a system where talent rises based on what people can actually accomplish.
It won't be perfect immediately, but it's promising enough to pursue aggressively. The alternative—watching an entire generation get locked out of entry-level opportunities—isn't exactly appealing either.