The Most Important AI Lesson Must Be Taught at Home: Why Parents Hold the Key to Responsible AI Use in Education
On April 1, the Union Education Minister launched the new CBSE curriculum on computational thinking and artificial intelligence for students of classes III to VII. The rollout is among the most ambitious school AI programmes anywhere in the world. Indian children will now learn how AI works, what its underlying algorithms are, and how to build basic models. This is a commendable step. There is no plan, however, for how those children will use AI. Consider the chatbot on a child’s phone at nine in the evening, sitting beside a half-finished assignment. The syllabus does not reach that moment. No curriculum can teach judgment. That lesson belongs to the home.
A recent study by the Salaam Bombay Foundation and NMIMS surveyed 1,050 Class IX students across 20 Mumbai municipal schools. More than 70 per cent reported using ChatGPT, mostly for maths problems, translations, and homework. The study also found early signs of cognitive offloading: letting a tool do our mental work. We all do this; the worry is when a child does it before learning how to think. A calculator is a tool. A spell-check is a tool. But these tools assist after the foundational skill is learned. AI chatbots can replace the foundational skill itself. A child who asks ChatGPT to solve a maths problem may never learn to solve it on their own. A child who asks ChatGPT to write an essay may never learn to structure an argument.
In a 2025 study, researchers from the University of Pennsylvania’s Wharton School tracked nearly 1,000 Turkish high-school maths students who were given AI tools during practice. Those with unrestricted AI access scored 48 per cent better than peers who studied without it. But when the AI was withdrawn, those who had used it did 17 per cent worse than peers who had never used it. The AI acted as a crutch: while it was present, the students performed better; once it was removed, they performed worse than those who had never leaned on it. The students had not learned the underlying mathematics; they had learned to use the AI.
The case against AI in education is narrower than these numbers suggest. A college student of mine struggles with English. He uses AI to re-explain the class notes in the language he understands best. For him, the AI is not a crutch; it is a bridge. He learns the material. He does not outsource his thinking. Which way it goes depends on the student’s judgment. No policy can teach that. Whether a child reaches for the chatbot to understand or to outsource depends on what the school demands and what the household rewards. The former is outside a parent’s control. The latter is not.
If the expectation at home is highest-marks-in-every-subject, an overloaded child will likely search for shortcuts. Schools that reward high marks share the blame. But parents have a larger stake in how their children turn out, and must move first. What does that mean in practice? Take the child into confidence. The cost of letting the chatbot do their thinking is real, and most children can absorb this if an adult explains it. They would love to think for themselves; they outsource because, in the moment, the alternative is harder. A parent who sits with a child and works through a problem together is teaching more than maths. They are teaching persistence, patience, and the value of effort.
Recalibrate. When your child reports a mark, ask how they got it. If the work was their own, praise it and let the mark be whatever it is. If the work was the chatbot’s, say so, and withhold praise. The mark becomes secondary to the method. This is a radical shift in many households, where the final number has long been the only thing that matters. But the final number tells you nothing about whether the child learned anything. A child who gets a 90 with the chatbot has learned less than a child who gets a 70 on their own.
None of this works without your own AI literacy. The point is not catching the chatbot in your child’s homework but knowing these tools well enough to ask sharp questions about their process. A parent who has never used ChatGPT cannot meaningfully discuss its role in education. A parent who has used it, who understands its capabilities and its limitations, can have a real conversation. This does not require becoming an expert. It requires fifteen minutes of experimentation.
The new curriculum will teach Indian children how AI works. Knowing when to set the chatbot aside is as essential as knowing how to use it. That lesson is taught at home, in a hundred small conversations about what learning is and what is beneficial in the long run. The parent who asks the child, after every assignment, to explain the work back, or to try it once with the screen closed, is giving them something AI cannot: the slow muscle of working things out, and the joy of arriving there themselves.
The school curriculum is ambitious. It is also incomplete. It teaches the mechanics of AI, but not the ethics of its use. It teaches the how, but not the when. The when is a judgment call. It depends on the child, the subject, the difficulty of the task, and the goal of the assignment. A chatbot used to generate ideas for a creative writing assignment is different from a chatbot used to write the assignment itself. A chatbot used to check the grammar of a completed essay is different from a chatbot used to generate the essay from a prompt. The distinctions are subtle, but they matter. They are also beyond the scope of any syllabus.
The most important AI lesson will not be taught in a classroom. It will be taught at the dining table, in the car, or while helping with homework. It will be taught by a parent who asks, “What did you learn today?” and means it. The government has done its part. The schools are doing theirs. Now it is the turn of parents. The stakes could not be higher. The children who learn to use AI as a tool, not a replacement, will thrive. The children who learn to outsource their thinking will struggle. The difference between them will be the conversations they have at home.
Questions and Answers
Q1: What did the Wharton School study of Turkish high school maths students reveal about the impact of unrestricted AI access?
A1: The study found that students with unrestricted AI access scored 48 per cent better than peers who studied without it during practice. However, when AI was withdrawn, those who had used it did 17 per cent worse than peers who had never used it, indicating that AI acted as a crutch preventing actual learning.
Q2: What is “cognitive offloading,” and why is it particularly concerning for children?
A2: Cognitive offloading is letting a tool do our mental tasks. It is concerning for children because it happens before they have learned how to think for themselves. A child who uses AI to solve maths problems may never learn to solve them independently.
Q3: What does the article say determines whether a child uses AI to understand or to outsource?
A3: Whether a child reaches for the chatbot to understand or to outsource depends on what the school demands and what the household rewards. The school demands are outside a parent’s control, but household rewards are not.
Q4: What practical steps does the article recommend for parents to guide their children’s AI use?
A4: The article recommends: taking the child into confidence by explaining the cost of outsourcing thinking; recalibrating by asking how a mark was achieved rather than focusing on the mark itself; withholding praise for chatbot-generated work; and developing one’s own AI literacy to ask sharp questions about the child’s process.
Q5: What does the article identify as the “most important AI lesson” and where should it be taught?
A5: The most important AI lesson is knowing when to set the chatbot aside, which is as essential as knowing how to use it. This lesson should be taught at home, through conversations about what learning is and what is beneficial in the long run, rather than in a classroom curriculum.
