AI in EdTech: The Reckoning
The dark truth about EdTech and how AI could finally force the industry to grow up.
Here’s what nobody except Gagan Biyani and me wants to say out loud: most educational apps are optimized to make money, not to actually teach you anything.
You’ve probably felt this yourself. Download Duolingo, feel great for a week, then quietly forget Spanish exists. Sign up for that online course, watch three videos, never finish. The app celebrates your “7-day streak!” while you secretly know you can’t actually do the thing you’re supposedly learning.
This isn’t your fault. It’s Duolingo’s business model.
And AI is about to make this problem much, much worse.
Or, if we’re lucky, finally fix it.
The Billion-User Illusion
EdTech companies love to brag about scale. “One billion users!” they announce, as if that number means anything.
But here’s the thing: a billion people logging in and bouncing off your app doesn’t prove you’re changing lives. It proves you’re good at acquisition marketing.
Researchers like Audrey Watters and Neil Selwyn have been shouting from the rooftops for years: reach is not the same as learning. Engagement is not the same as skill development and enablement.
You can “feel productive” while learning absolutely nothing.
Education is uniquely vulnerable to this trap because you can’t tell if it’s working while you’re doing it. Skills take hours to build and weeks to stick.
By the time you realize that online course was useless, the company already has your money and your data, and is proudly sharing both with investors.
AI turbo-charges this problem. Large language models can generate perfect explanations, solve problems instantly, and create a beautiful illusion that you’re getting smarter, while you’re actually getting more dependent.
Here’s the twist: The very same AI could finally force the whole industry to get very honest with itself, very fast.
Why You Forget Everything You “Learned”
Cognitive science has known for decades that learning isn’t just exposure to information. Daniel Willingham put it bluntly: students don’t like school because thinking is hard, and what you learn in one context almost never transfers to another.
The research is brutal:
Smooth explanations feel good but don’t stick (Robert and Elizabeth Bjork’s work on “desirable difficulties”)
Repetition alone doesn’t create expertise. You need struggle, feedback that pinpoints what’s blocking you and clears the path, and the chance to fix your mistakes (Anders Ericsson on “deliberate practice”)
Ease is the enemy of mastery
From an EdTech business perspective, this is all a nightmare. Make learning hard, and users complain and quit. Make it easy, and users eventually realize they can’t actually do anything, and then they quit anyway.
The fastest-growing EdTech products are often the most fragile ones. And the more fragile they are, the harder they optimize for metrics that have nothing to do with learner success.
AI doesn’t solve this problem. It exposes it in high definition.
Why You Actually Quit Learning Apps
Here’s the real reason you stopped using that language app: not because you couldn’t do it, but because your motivation died.
According to Self-Determination Theory, you stay motivated when three needs are met: autonomy (feeling in control), competence (feeling capable), and relatedness (feeling connected).
Most AI-powered learning tools accidentally destroy all three:
They kill autonomy by making too many decisions for you
They fake competence by hiding your struggles behind smooth AI assistance
They replace human connection with bland robo-feedback
Richard Thaler’s work on nudges taught us something crucial: people respond to whatever gets rewarded, whether designers intend it or not. But as BJ Fogg points out, motivation alone won’t save you: you need ability and the right triggers working in sync.
This is where Angela Duckworth’s research on grit becomes painfully relevant. Perseverance matters, but only when the goal feels genuinely meaningful. When a learning app rewards speed over mastery, or completion over application, you’ll optimize for exactly that.
The moment you realize you can’t actually speak Spanish despite your 100-day streak? You’re gone.
AI can either automate disengagement or help learners see their progress in real time.
The Breakthrough: Making Progress Measurable and Visceral
Here’s where things get interesting.
The biggest disconnect in EdTech is this: real learning is slow and invisible, but companies need to show fast, visible wins. AI can actually bridge that gap.
Instead of being your tutor, AI can be your progress tracker, the thing that shows you’re genuinely getting better at something real.
Well-designed AI systems can:
Track actual skill growth over time, not just “lessons completed” (see the sketch after this list)
Generate targeted practice that hits your weak spots
Show you evidence of transfer: “You couldn’t do this last month. Now you can.”
Help you reflect on how your thinking has changed
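To make that first point concrete: here’s a minimal sketch, in Python, of one way an app could estimate mastery of a single skill from practice attempts instead of counting lessons. It uses classic Bayesian Knowledge Tracing; the class name and every parameter value below are illustrative assumptions, not any vendor’s real system.

```python
# Illustrative only: a toy Bayesian Knowledge Tracing estimate for one skill.
# All parameter values are assumptions for demonstration, not tuned figures.
from dataclasses import dataclass


@dataclass
class SkillEstimate:
    p_known: float = 0.20   # prior belief the learner already has the skill
    p_learn: float = 0.15   # chance the skill is acquired on each attempt
    p_slip: float = 0.10    # chance of a wrong answer despite mastery
    p_guess: float = 0.25   # chance of a right answer without mastery

    def update(self, correct: bool) -> float:
        """Update the mastery estimate from one right/wrong practice attempt."""
        if correct:
            evidence = self.p_known * (1 - self.p_slip)
            posterior = evidence / (evidence + (1 - self.p_known) * self.p_guess)
        else:
            evidence = self.p_known * self.p_slip
            posterior = evidence / (evidence + (1 - self.p_known) * (1 - self.p_guess))
        # The learner may also pick up the skill during the attempt itself.
        self.p_known = posterior + (1 - posterior) * self.p_learn
        return self.p_known


if __name__ == "__main__":
    skill = SkillEstimate()
    attempts = [False, False, True, False, True, True, True]
    for i, correct in enumerate(attempts, start=1):
        p = skill.update(correct)
        print(f"attempt {i}: {'right' if correct else 'wrong'} -> mastery {p:.2f}")
    # A "lessons completed" counter would read 7 either way; this estimate
    # only climbs when the learner's answers actually improve.
```

The specific model doesn’t matter. What matters is that the number on the learner’s dashboard moves only when their performance does, which is exactly the gap between “lessons completed” and capability.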
This isn’t personalisation for the sake of personalisation. This is making learning feel tangible and real.
When you can actually see yourself gaining abilities you can use outside the app, everything changes. You stay. You pay. You tell your friends.
AI can turn learning from “trust me, it works” into “holy shit, I can feel this working for me.”
The 4-Layer Test: How to Spot Real Learning
Want to know if an AI learning tool is legit? Ask these four questions:
1. Capability Gain
What can I do now that I couldn’t do before?
If you can’t answer this concretely, it’s not learning, it’s engagement theatre.
2. Progress Legibility
Do I understand how and why I’m improving?
AI should make your growth visible, not hide it behind smooth experiences.
3. Motivation Alignment
Do the short-term rewards match long-term mastery?
If ease replaces effort, you’ll quit eventually. Guaranteed.
4. Economic Sustainability
Does better learning drive the business, or do dark patterns?
If the company makes more money when you don’t learn, run.
Only when all four align do you get a learning product that actually works, and lasts.
The Dependency Trap
Critical scholars like Ben Williamson warn that AI in education is never neutral. These platforms shape what counts as knowledge, what counts as progress, what counts as success.
If AI systems optimise for engagement metrics instead of human capability and enablement, we’re building a generation of confident, dependent learners who collapse the moment you take their AI away.
This isn’t just a moral problem. It’s a business problem. Dependency destroys trust. Low-to-no trust kills long-term value.
The EdTech companies that survive will be those that treat learning outcomes as their competitive advantage, not a marketing tagline.
In a world drowning in AI, credibility becomes the rarest and most valuable currency.
The Bottom Line
The tension between making money and actually helping people learn is real. But there’s a simple alignment point:
EdTech companies only deserve to grow if learner success makes them grow, not paid ads, not viral loops, not engagement tricks.
AI eliminates all excuses. We can now measure, model, and surface learning in ways that were impossible before.
The question isn’t whether AI will be used. It’s what we’ll optimise for.
If we optimise for making learning easier, faster, and emptier, we’ll get scale without substance: a billion users who can’t do anything.
If we optimise for making progress tangible and even feel exciting in the moment, we might finally build an EdTech industry where making money and making lives better reinforce each other.
AI isn’t making learning more lucrative. It’s making EdTech accountable.
And it’s about damn time.
Kate Busby is Adjunct Professor of AI Marketing at ESEI International Business School, Co-Founder of Quiet Edge, and Fractional CMO and Board Advisor to EdTech startups from Pre-Seed to Series B. Catch her on X and LinkedIn. To receive more articles exploring AI in EdTech, subscribe on Substack.