AI can also help improve student writing, the report finds, as long as it is used to support students’ efforts rather than do the work for them: “Teachers report that AI can ‘spark creativity’ and help students overcome writer’s block. … At the drafting stage, it can help with organization, coherence, syntax, semantics and grammar. At the revision stage, AI can support editing and rewriting ideas as well as help with … punctuation, capitalization and grammar.”
But if there’s a refrain in the report, it’s this: AI is most useful when it complements, not replaces, the efforts of a flesh-and-blood teacher.
Cons: AI poses a serious threat to students’ cognitive development
At the top of Brookings’ list of risks is the negative effect AI could have on children’s cognitive growth—how they learn new skills and perceive and solve problems.
The report describes a kind of vicious cycle of AI dependence, in which students increasingly offload their own thinking onto the technology, leading to the kind of cognitive decline, or atrophy, more commonly associated with aging brains.
Rebecca Winthrop, one of the report’s authors and a senior fellow at Brookings, warned: “When kids use a generative AI that tells them what the answer is … they’re not thinking for themselves. They’re not learning to parse truth from fiction. They’re not learning to understand what makes an argument good. They’re not learning about different perspectives in the world because they’re not really engaging with the material.”
Cognitive offloading is nothing new. The report notes that keyboards and computers reduce the need for handwriting, and calculators automate basic math. But AI is turbocharging that kind of offloading, especially in schools where learning can feel transactional.
As one student told the researchers, “It’s easy. You don’t have to (use) your brain.”
The report offers ample evidence that students using generative AI are already seeing declines in content knowledge, critical thinking and even creativity. The consequences could be enormous if these young people grow into adults who never learned to think critically.
Pro: AI can make teachers’ jobs a little easier
The report says another benefit of AI is that it allows teachers to automate some tasks: “generating parent emails … translating materials, creating worksheets, rubrics, tests and lesson plans” — and more.
The report cites multiple studies documenting significant time savings for teachers, including one US study that found teachers using AI saved an average of nearly six hours per week, the equivalent of about six weeks over a full school year.
Pros/Cons: AI can be an engine of justice or injustice
One of the strongest arguments in favor of the educational use of AI, according to the Brookings report, is its ability to reach children who have been excluded from the classroom. The researchers cite Afghanistan, where girls and women have been denied access to formal education beyond primary school by the Taliban.
According to the report, a program for Afghan girls “uses AI to digitize the Afghan curriculum, create lessons based on that curriculum and distribute content in Dari, Pashto and English through WhatsApp lessons.”
AI can also help make classrooms more accessible to students with a wide range of learning disabilities, including dyslexia.
But “AI could greatly increase existing divisions,” Winthrop warns. That’s because the free AI tools most accessible to students and schools can also be the least reliable and least factually accurate.
“We know that wealthier communities and schools will be able to afford more advanced AI models,” says Winthrop, “and we know that those more advanced AI models are more accurate. Which means that this is the first time in the history of electronic technology that schools will have to pay more for more accurate information. And that really hurts schools without a lot of resources.”
Cons: AI poses serious threats to social and emotional development
Survey responses revealed deep concerns that the use of AI, especially chatbots, “undermines students’ emotional well-being, including their ability to form relationships, bounce back from setbacks and maintain their mental health,” the report said.
One of the many problems with children’s overuse of AI is that the technology is inherently sycophantic: it is designed to agree with users and reinforce their beliefs.
Winthrop says that if children are building social-emotional skills largely through interactions with chatbots that are designed to agree with them, “it becomes very uncomfortable to be in an environment where someone disagrees with you.”
Winthrop offers the example of a child interacting with a chatbot, “complaining about your parents and saying, ‘They want me to do the dishes—it’s so annoying. I hate my parents.’ The chatbot will probably say, ‘You’re right. You’re misunderstood. I’m so sorry. I understand you.’ Versus a friend who would say, ‘Dude, I do the dishes all the time at my house. I don’t know what you’re complaining about. That’s normal.’ That’s where the problem is.”
A recent study from the Center for Democracy and Technology, a nonprofit that advocates for civil rights and civil liberties in the digital age, found that nearly 1 in 5 high school students say they or someone they know has had a romantic relationship with an artificial intelligence. And 42% of students in the survey said they or someone they know has used AI for companionship.
The report warns that AI’s echo chamber can hinder a child’s emotional growth: “We learn empathy not when we are fully understood, but when we misunderstand and recover,” said one of the experts surveyed.
What to do about it
The Brookings report offers a long list of recommendations to help parents, teachers and policymakers—not to mention tech companies themselves—harness the good of AI without exposing children to the risks the technology currently poses. Among these recommendations:
- The learning itself could be less focused on what the report calls “completing a transactional task” or an assessment-based end game, and more focused on fostering curiosity and a desire to learn. Students will be less inclined to ask AI to do the work for them if they feel engaged with that work.
- AI designed for use by children and teenagers should be less sycophantic and more “antagonistic,” challenging preconceptions and pushing users to reason and evaluate.
- Tech companies could collaborate with educators in “co-design centers.” In the Netherlands, a government-backed center is already bringing together technology companies and educators to develop, test and evaluate new AI applications in the classroom.
- Holistic AI literacy is critical—for teachers and students alike. Some countries, including China and Estonia, have comprehensive national guidelines for AI literacy.
- As schools continue to embrace AI, it is important that underfunded areas in marginalized communities are not left behind, lest AI drive inequality even further.
- Governments have a responsibility to regulate the use of AI in schools, ensuring that the technology protects students’ cognitive and emotional health, as well as their privacy. In the US, the Trump administration has tried to bar states from regulating AI themselves, even though Congress has so far failed to create a federal regulatory framework.
