5 strategies for mentoring junior developers in the AI era
The real risk of AI isn't bad code; it's weak thinking. Here's how to fix that.
Note: This post is for those of us who lead and manage engineers. But if you’re a junior or aspiring developer, keep reading: the takeaways you'll learn are crucial for your long-term success.
AI tools are fantastic. They boost productivity, help teams move faster, and can even prevent burnout.
But a 2025 Microsoft report, The Impact of Generative AI on Critical Thinking, highlighted a growing risk we can't ignore. The study assessed knowledge workers and found that:
Daily AI users self-reported a reduced need for critical thinking.
Higher trust in AI correlated with lower engagement in problem-solving.
This is especially risky for junior engineers still building foundational skills.
If we're not careful, overreliance on AI can erode the critical thinking that turns juniors into senior engineers.
And without strong thinking, they can't effectively supervise what AI produces.
The scary part? Most junior devs don’t even realize when they’re falling into this trap.
That's why strong mentorship matters more than ever.
Today, I'll share 5 battle-tested strategies I use to mentor junior engineers in the AI era. (And if you’re a junior developer yourself, you’ll walk away with tools you can bring to your next 1:1).
Let's go.
5 battle-tested strategies for mentoring junior engineers in the AI era
1. Encourage healthy skepticism of AI output
AI tools hallucinate. Sometimes in subtle ways that pass reviews. Other times, in ways that break production.
My friend recently saw a junior blindly copy AI-generated code for payment handling. It looked clean and passed tests, but buried deep inside was a logic bug that allowed duplicate refunds. Luckily, they caught it in review (barely).
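To make the failure mode concrete, here is a hypothetical sketch of that class of bug (the names and logic are illustrative, not the actual code from the story): a refund handler that looks clean and passes happy-path tests, but lets the same payment be refunded twice because nothing makes the operation idempotent.

```python
def issue_refund(payment_id: str, amount: float, processed: set) -> bool:
    """Issue a refund at most once per payment_id.

    The `if` guard below is exactly the kind of check a generated
    version can silently omit: without it, retries or double-clicks
    trigger duplicate refunds. In production, `processed` would be a
    unique constraint or idempotency key in the database, not a set.
    """
    if payment_id in processed:   # reject duplicate refund requests
        return False
    processed.add(payment_id)
    # ... call the payment provider's refund API here ...
    return True
```

A reviewer who asks "what happens if this request is retried?" catches the missing guard in seconds; a reviewer who only checks that tests pass does not.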
So your goal as a mentor is to build discernment, not dependence.
Teach juniors to question AI suggestions by default. Ask them to cross-check AI answers against official documentation and system constraints.
💡 Tip: In PR reviews, have developers annotate AI-generated code with explanations of why they trust it (or what they changed).
2. Assign debugging without AI
Back at Facebook, I’d hand juniors a broken backend service and say: “Fix it. No Google. No Stack Overflow.”
The first few times, it stumped them.
But by the third or fourth attempt? They were flying.
There’s a concept in education called productive struggle. It's the idea that learning happens when students wrestle with problems just beyond their current ability.
Debugging is that productive struggle for engineers.
AI gives answers in seconds, and that's exactly the problem: it skips the staring at logs, the tracing of bugs, and the surviving of chaos that build real debugging intuition and confidence.
💡 Tip: Set aside debugging exercises where they have to read logs, use printf, and stare into the abyss of a stack trace. No pain, no gain.
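One low-effort way to set up such an exercise (a hypothetical sketch, not a prescribed curriculum): take a small, well-specified function, instrument it with printf-style traces, and have the junior compare the trace against what they expect at each step. The function below is correct; for the exercise, you would plant a subtle off-by-one (say, in the slice bounds) and let them localize it from the printed state alone.

```python
def moving_average(values, window):
    """Average of the last `window` values at each index.

    The print is the whole point of the exercise: the trace shows the
    exact chunk being averaged, so a planted slicing bug becomes
    visible the moment the printed chunk disagrees with expectations.
    """
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        print(f"i={i} chunk={chunk}")  # printf-style trace
        out.append(sum(chunk) / len(chunk))
    return out
```

No debugger, no AI, no Stack Overflow: just the program telling you what it actually did versus what you assumed it did.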
3. Turn code reviews into discussions (not checklists)
A while ago, I was reviewing a PR from a new hire. I asked: “If this API starts getting 100x more traffic, what breaks first?”
He paused. Then said, “Honestly, I have no idea.”
That turned into a 30-minute chat about caching, rate limits, and circuit breakers. Way more valuable than nitpicking line length.
If your reviews are just “add a test” or “fix spacing,” you’re missing the chance to spark critical thinking. Not just about the code today, but about how to design systems that survive tomorrow.
Push juniors to explain their decisions. Ask about edge cases. Challenge alternatives. Make it about how they think, not just what they wrote.
💡 Tip: Try this prompt: “What assumptions does this code make about its inputs? What happens if those assumptions are wrong?”
4. Introduce system thinking earlier than feels comfortable
AI automates grunt work, which means juniors hit “done” faster.
Great. Now what?
As a mentor, this is your chance to teach them to think about architecture, failure modes, and scale.
System thinking used to be something engineers learned later in their careers. Now, with AI handling more of the low-level work, juniors can advance to high-level topics like system design earlier.
You can start with small design problems like:
“How would you build a feature flag system that can scale 10x?”
“What would break if we suddenly got double the traffic?”
💡 Tip: Review their thought process, not just the diagram.
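It helps to anchor these design conversations on something concrete. Here is a minimal sketch of one piece of a feature flag system: a deterministic percentage rollout. (This is an illustration I'm supplying, not a reference implementation; a real system adds flag storage, targeting rules, and kill switches.)

```python
import hashlib

def is_enabled(flag: str, user_id: str, rollout_percent: int) -> bool:
    """Deterministic percentage rollout.

    Hash (flag, user) into a stable bucket 0-99 and compare against
    the rollout threshold. The same user always lands in the same
    bucket, so ramping 1% -> 10% -> 100% never flips users back off.
    """
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < rollout_percent
```

Even this toy version seeds good follow-up questions: where does the threshold live, how do changes propagate, and what happens when the flag store is down?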
5. Simulate outages and scale failures
Everything breaks in production, and the only way to prepare is practice.
How do you think Netflix boasts 99.99% uptime? By exposing engineers to failure. Their Chaos Monkey tool randomly terminates production instances, forcing teams to build resilience into every service.
So, pick a day each month to unplug your most critical server.
…Just kidding.
But you should be doing something similar.
You can try chaos engineering tactics like:
Simulating full or partial outages
Forcing edge-case bugs into the system
Injecting sudden traffic spikes or resource exhaustion
💡 Tip: Start in staging. Use tools like Gremlin or simple failure scripts to introduce latency or kill services.
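A "simple failure script" can be as small as this sketch: a wrapper that makes any callable behave like a flaky dependency by injecting latency and random failures. (The function names and parameters here are my own illustration, not a Gremlin API; in staging you'd point something like this at a real service call.)

```python
import random
import time

def chaos(func, latency_s=0.5, failure_rate=0.1, seed=None):
    """Wrap a callable so calls sometimes stall or fail outright,
    mimicking a flaky downstream dependency in a staging environment."""
    rng = random.Random(seed)  # seedable so experiments are repeatable

    def wrapped(*args, **kwargs):
        if rng.random() < failure_rate:
            raise ConnectionError("chaos: injected dependency failure")
        time.sleep(rng.random() * latency_s)  # injected latency
        return func(*args, **kwargs)

    return wrapped
```

Wrap a service client with it, then watch how the junior's retry logic, timeouts, and fallbacks hold up. That conversation teaches more than any uptime dashboard.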
Growing coders into problem-solvers
AI makes juniors faster, but only guardrails make them wiser.
If you coach them to ask why, think in systems, debug without a crutch, and reason through failure, you’ll build engineers who ship resilient, scalable, production-ready systems.
The best E5s I interviewed weren’t the ones who solved the hardest problems. They were the ones who explained why they made each design choice.
That’s who you’re building. And honestly, isn’t that the teammate you want next to you when the pager goes off?
You can find more tips and frameworks to be an effective mentor in Educative's course: Become an Effective Software Engineering Manager.
You may also want to check out Educative's Generative AI resources to learn how to get hands-on with AI responsibly.
Happy learning!
— Fahim