The conversation about artificial intelligence in education has been dominated by fear, regulation, and endless policy debates. But Nate B Jones makes a simpler argument: we're living through the most transformative educational shift since electronic calculators in the 1970s — and most parents are unprepared.
Jones is a parent of three children who works in AI every day. His piece is part manifesto, part personal reflection, and part deep analysis of where education stands today. The core insight? We need to teach our kids two things simultaneously: foundational human skills AND AI fluency. Not one or the other — both.
The Scale of What Changed
The evidence for radical change in education isn't scattered. It's overwhelming.
According to Nature, artificial general intelligence has arrived — machines Turing envisioned 75 years ago are here today. One person used Claude Code to build an entire medical school curriculum in just two weeks: 450 lectures, 16,000 figures, roughly 100 million tokens of automated work with multiple rounds of error checking, and 99% of it was flawless. That is work that would normally take hundreds of faculty-years to produce.
Globally, 86% of students report using AI in their learning according to the Digital Education Council. In the UK, usage surged from 66% in 2024 to 92% in 2025 — an almost complete adoption in just one year.
A Harvard study published last year found that students using AI tutors learned more than twice as much material in less time than students in traditional settings. A collaboration between MIT and Google DeepMind showed AI tutoring systems outperforming human tutors on problem-solving tasks: 66% versus roughly 60%. When you combine human teachers with AI tutoring, the knowledge transfer doubles.
Khan Academy's AI tutor, Khanmigo, went from 68,000 users to 1.4 million in just one year. An eight-year-old can build video games with Claude today by typing instructions like "make the bad guys tigers" and "make them move slower." A mother with no coding background built a personalized AI tutor for her dyslexic son using what Jones calls vibe coding — natural language, iteration, nothing else.
The tool ended up freeing the learner from the mechanical to engage with the meaningful.
This is like the calculator moment, except it's not just arithmetic. It touches reading, writing, research, analysis, coding, creative work, communication, and problem solving: every cognitive task AI can now perform competently.
Why Math by Hand Still Matters
Jones's ten-year-old sat at the kitchen table working through long division by hand with a pencil because he asked her to. He's also teaching her to vibe code with Claude. These aren't contradictory positions — they're the only positions that make sense together.
The reasoning connects directly to what matters most in an AI-driven world: the quality of output is determined by the quality of human specification. One agent negotiated $4,200 off a car purchase while its owner was in a meeting. Another sent 500 unsolicited messages to friends and family — same technology, same architecture, but different human specifications.
If you have clear objectives, defined constraints, and bounded communication channels, you're in business. If you have broad access and vague boundaries, you can't specify well, and you're in trouble. That's a human skill practiced manually — something we can teach our kids.
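The contrast between the two agents can be made concrete. Here is a minimal sketch, in plain Python, of what "clear objectives, defined constraints, bounded channels" might look like as a checkable specification. All names here (AgentSpec, is_well_bounded, the example fields) are hypothetical illustrations, not any real agent framework's API.

```python
from dataclasses import dataclass

@dataclass
class AgentSpec:
    """A hypothetical agent specification: every field forces a
    human decision before the agent is allowed to act."""
    objective: str            # what success looks like, in one sentence
    constraints: list[str]    # hard limits the agent may not cross
    allowed_channels: list[str]  # who or what the agent may contact

def is_well_bounded(spec: AgentSpec) -> bool:
    # A vague spec has no constraints or open-ended channel access.
    return (bool(spec.objective)
            and bool(spec.constraints)
            and "anyone" not in spec.allowed_channels)

# Roughly the car-negotiation agent: narrow goal, hard limits, one channel.
car_agent = AgentSpec(
    objective="Negotiate the listed price of one specific car down",
    constraints=["do not exceed the stated budget",
                 "make no binding commitments"],
    allowed_channels=["dealer_email"],
)

# Roughly the 500-message agent: broad access, vague boundaries.
spam_agent = AgentSpec(
    objective="Reconnect me with people",
    constraints=[],
    allowed_channels=["anyone"],
)

print(is_well_bounded(car_agent))   # True
print(is_well_bounded(spam_agent))  # False
```

The check itself is trivial; the point is that writing the spec forces exactly the judgments Jones says kids need practice making.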
You can't write a good specification for something you don't understand. You can't evaluate an AI's output in a domain where you have no knowledge. And you can't exercise good judgment — taste, discernment, critical thinking — about work you've never engaged with deeply enough to internalize.
When Jones's daughter asks Claude to help with a math problem, he wants her to know enough to recognize when Claude is wrong. When she uses Claude for coding, he wants her to know enough to recognize good separation of concerns as an architectural principle. These are human skills first — and we leverage them with AI tools later.
What Humans Build That AI Can't
Reading physical books builds mental models that no AI can build passively. Not because AI can't explain what Moby Dick means, but because the cognitive work of reading, of struggling with the text, of rereading, of integrating the ideas is itself the learning a human brain needs. The struggle is the point.
Math by hand builds a sense of numbers you don't get any other way — an intuitive feel for magnitude, proportion, and relationships that no shortcut can deliver, including talking with ChatGPT about statistical distributions.
Writing by hand builds the connection between thinking and expression that typing and dictation tend to compress, in ways that impair both memory and comprehension.
The evidence suggests AI is great for learning. It shows that if AI can extend one-on-one tutoring principles, we should use it. But the foundation comes first — built through effortful practice and childhood discipline, not through efficiency.
What Kids Actually Learn When They Vibe Code
Jones watched his kid vibe code websites and loved it. What he sees when kids use AI isn't intellectual laziness — it's a different kind of intellectual work that's genuinely valuable.
Last week, his daughter wanted enemies in a game she was building. She typed "add enemies." Claude added enemies that spawned off screen, moved in the wrong direction, and couldn't be hit. "It doesn't work," she said. So they talked about it. He asked her what she really wanted the enemies to do.
She thought about it and then said: "Add three enemies that spawn from the right side of the screen, move them left at about a medium speed, and make them disappear when the player touches them." Suddenly, she got the behavior she was looking for. That conversation taught her more about specification quality than any scripted lesson.
When a kid vibe codes, they're doing several things simultaneously: specifying requirements in natural language for something they want to do, decomposing a complicated vague desire into discrete tasks, and learning to iterate — test the result, see what doesn't match, refine the specification. They're not debugging code. They're debugging their own intent. And these skills transfer directly to professional software development and building products for customers.
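Her refined specification is precise enough to pin down actual code. The sketch below (plain Python, no game library — the names and the "medium speed" constant are illustrative choices, not what Claude actually generated) shows how each clause of her sentence maps to a concrete line:

```python
import random

WIDTH, HEIGHT = 800, 600
MEDIUM_SPEED = 3  # pixels per tick; "about a medium speed" was her call

def spawn_enemies(count=3):
    # "Add three enemies that spawn from the right side of the screen"
    return [{"x": WIDTH, "y": random.randint(0, HEIGHT)}
            for _ in range(count)]

def step(enemies, player):
    survivors = []
    for e in enemies:
        e["x"] -= MEDIUM_SPEED  # "move them left"
        touching = (abs(e["x"] - player["x"]) < 10
                    and abs(e["y"] - player["y"]) < 10)
        if not touching:  # "make them disappear when the player touches them"
            survivors.append(e)
    return survivors

enemies = spawn_enemies()          # three enemies, all at the right edge
player = {"x": 400, "y": 300}
enemies = step(enemies, player)    # one tick: everyone drifts left
```

Every behavior the code exhibits traces back to a clause she wrote. The vague "add enemies" left all of these decisions to the machine; the precise version left it none.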
You need to be proficient and also independent — not one or the other.
Andrej Karpathy, Tesla's former head of AI and founder of Eureka Labs, built what he calls an AI-native school with a stated goal: raise young people who are proficient in the use of AI but can also exist without it. That formulation matters. You need both.
Karpathy also said something every parent and teacher needs to hear right now: "You will never be able to detect the use of AI in homework." The arms race between AI writing detection and AI writing generation was over before it started. The educational response cannot be better detection — it has to be a fundamental rethinking of what we're measuring and why.
The Defining Competence
The skill connecting foundation to AI fluency is metacognition: the ability to think about your own thinking, to know what you know, to know what you don't, and to make deliberate decisions about when to rely on yourself versus when to delegate to a tool.
Researchers increasingly call this the defining competence of the AI age. Not what you know, not what the machine knows, but your capacity to move between the two — strategically allocating your cognitive efforts, coordinating AI-assisted tasks, and evaluating results against your own understanding.
In practice, it's the difference between a kid who asks ChatGPT to write the essay and a kid who drafts the essay, uses AI to identify weak arguments, and strengthens them with her own thinking.
Bottom Line
Jones's argument is both compelling and practical: we must build foundational human skills first, then give kids AI tools. The calculator analogy holds — students learned mechanics first, understood what calculators did, could estimate whether answers were reasonable, caught errors. They had the foundation and the tool extended it.
The strongest part of this piece is the directness — no hedging about what kids need. The biggest vulnerability is that many schools aren't equipped for either side of this equation. They're still running on 20th-century industrial philosophy while two billion kids inherit a world where machines can do most things we spent the last century teaching humans to do.
What comes next? Metacognition — knowing when to delegate and when to think for yourself — is the only skill that matters more every year.