American Compass Comment on Department of Education’s Proposed Priority and Definitions “Secretary's Supplemental Priority and Definitions on Advancing Artificial Intelligence in Education”

Given the significant challenges facing the American education system, and the rapid changes artificial intelligence (AI) is bringing to our workforce and society, the Department of Education is right to think critically about the role of AI in equipping today’s students to be tomorrow’s leaders. If we get this technological transition right, it could help secure American global leadership in coming decades; if we get it wrong, however, it could set our students back and leave them unprepared to thrive in our AI-powered future. The Department of Education, accordingly, must insist upon effective and age-appropriate design guided by rigorous research and chastened by the lessons of past EdTech failures, where screen-based learning has too often fostered isolation, distraction, and intellectual laziness. Drawing on these lessons and input from researchers, parents, and educators, the Department should develop a detailed code of best practices to govern its grantmaking in this domain.

In its proposed priority, the Department identifies two distinct goals. The first is to foster greater “AI literacy,” equipping students to understand and effectively use AI tools and systems, so that students enter adulthood well-prepared for an increasingly AI-driven world. The second is to use AI as a means to improve educational outcomes across the board, for a broad range of subjects.

Until now, these two goals have been largely at odds with each other: American students, it seems, understand how to use AI tools all too well, frequently much better than their parents and teachers, whom they regularly dupe by producing AI-generated assignments and passing them off as their own work. Teachers from junior high on up have reported epidemic levels of AI-enabled cheating, and studies so far show that use of chatbots for academic work tends to undermine reasoning, retention, and writing skills.[1] This outcome is surely not inevitable. These technologies are in their early stages, and can no doubt be improved and crafted to better meet educational objectives. Teachers, administrators, parents, and students can all be taught how to harness these powerful tools rather than simply being caught up and carried along by them.

[1] Natalia Kosmyna, et al., “Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task,” arXiv preprint, 2025 (https://arxiv.org/pdf/2506.08872v1).

However, the tendency of AI to undermine rather than enhance learning thus far should caution us as to the magnitude of the challenge that lies before us if students are “to move from passive users of AI to active creators and innovators.” While many of the goals in the Department of Education’s proposed priority are laudable, much more work needs to be done on the front end if these goals are to be met. The proposed priority highlights the need to:

“(ix) Support dissemination of appropriate methods of integrating AI into education.

(x) Build evidence of appropriate methods of integrating AI into education.”

We respectfully urge that not only should (x) come before (ix), but both should be prioritized ahead of the preceding eight points, which call for various forms of “expanding” AI offerings. In order to build evidence of how best to integrate AI into education, we suggest that the Department should (1) collate and assess existing research on educational impacts of AI and digital tools more broadly, (2) convene an advisory body of parents, educators, researchers, and technologists to help evaluate this research, and (3) establish pilot programs with measurable benchmarks and mandatory reporting to inform future expansion. Only thus will we be in a position to reap the benefits of AI without its collateral harms.

How Can We Educate for AI Literacy?

The Department’s first goal, to equip today’s youth for tomorrow’s jobs in an increasingly AI-driven world, is an essential one. This goes beyond mere career-readiness to what it means to raise the next generation to be effective citizens. We must teach them to understand artificial intelligence systems, their pros and cons, when they can be relied upon, and how they can be abused. Our schools have a key role to play in this formation. But it is critical to note that learning about AI is not the same as learning with AI. While there are practical skills for using AI effectively that require hands-on training, such training must be accompanied by an understanding of how large language models work, how they are trained, and what sorts of problems they excel at. The advent of AI calls not for a doubling down on mere technical training, but for the revival of both the humanities (equipping students to engage with questions in philosophy of language, philosophy of mind, and ethics) and technology studies (so they understand how AI systems work).

If the goal of the Department is to prepare students for an AI-transformed workforce, it must note that many of the sharpest prognosticators of our AI future anticipate that this workforce will be one that values not computer skills, but human skills. It is widely anticipated that some of the first jobs replaced by AI will be coding jobs and others that involve humans manipulating data on computers.[2] As AI becomes considerably more efficient than humans at such on-screen tasks, the jobs of the future will increasingly select for and reward off-screen skills: managerial skills, relational skills, physical competencies in areas AI has not yet penetrated, and understanding of human nature and psychology. These are exactly the skills which are in danger of atrophying if we move to maximize students’ time on devices interacting with AI systems. Moreover, while it is important for students to understand the broad principles of how AI works, we should be mindful that given the rapid advances in AI technology, the specific systems and processes in common use today will likely be radically transformed within a decade. Thus we should beware of investing too much time in teaching fifth-graders how to use today’s AI tools in the hope that this will improve their on-the-job AI competence when they graduate college.

[2] See Abigail Ball, “White collar workers displaced by AI could spark a revolution,” New York Post, July 14, 2025 (https://nypost.com/2025/07/14/opinion/white-collar-workers-displaced-by-ai-could-spark-a-revolution/).

Effective AI literacy education, then, would begin with foundational instruction in how machine learning works, along with the basics of computer science, before proceeding to hands-on practice in using AI tools for a variety of applications in the upper grades. We would encourage the Department to consider the model of the high-school computer lab of the 1990s as an effective context for introducing students to AI systems, without allowing them to colonize all grades or the entire school day.

Can AI Improve Educational Outcomes?

The Department’s proposed priority seems to envision that colonization of all subjects by AI tools may produce improved educational outcomes across the board. Resisting this temptation will be hard. It is no secret that the American public education system is broken. In a technological age, it is natural to seek a technological fix, and what technology is better suited, or more powerful, than AI? AI promises to ease the burdens on overstrained teachers, allowing them to share the work of teaching and grading with artificial tutors, to tailor instruction and assignments to the needs of particular students, and to assess results and modify lesson plans accordingly. AI also promises to re-engage distracted and struggling students, working patiently with them to provide challenge and encouragement until they master needed concepts.

But not all problems are technical problems, capable of technical solutions. At its root, the task of education is about forming human beings for mature judgment as citizens, competent mastery of problem-solving skills, and healthy social engagement with other people. Such human skills require human teachers, and technology can at best provide a supporting role. The inversion of this relationship over the past decade, as screens have largely taken over the American classroom, has coincided with declining learning outcomes, and studies suggest that in this case at least, correlation does equal causation.[3] Bad technology can sometimes be remedied by better technology, but only if we carefully grapple with and learn from past mistakes.

[3] See, for instance, Jared Cooney Horvath, “The EdTech Revolution Has Failed,” After Babel, November 12, 2024 (https://www.afterbabel.com/p/the-edtech-revolution-has-failed); OECD, “Students, Computers, and Learning: Making the Connection,” OECD Publishing, 2015 (https://www.oecd.org/content/dam/oecd/en/publications/reports/2015/09/students-computers-and-learning_g1g57f3a/9789264239555-en.pdf).

Recommendations

The Department should develop and disseminate a code of best practices and standards to govern any grantmaking related to the adoption of AI tools and systems in our schools. This code should take into account what we have learned from past deployments of EdTech, whose negative learning outcomes stem from at least three sources: isolation, distraction, and laziness. Accordingly, the Department’s guidance for AI educational programs should encourage the development and deployment of AI tools that foster collaboration, limit time-on-device, and relentlessly prioritize skill development over task completion.

First, students learn much more effectively when they learn together. Collaborative problem-solving engages all of our faculties and prepares students effectively for the challenges they are likely to face in the adult workforce. Screens tend to isolate students from one another and the teacher, physically and attentionally, diminishing motivation to learn and causing social skills to atrophy. We urge the Department of Education to encourage the deployment of AI tools and educational practices that foster collaboration rather than isolation.

Second, the normalization of smartphones and Chromebooks, with their myriad distractions, has fragmented attention and undermined students’ capacity to listen, learn, reflect, and patiently problem-solve. Studies suggest that students are off-task between 29% and 63% of the time when using computers in class, depending on the grade and instructional context.[4] With human teachers no longer able to command their students’ wandering attention, it is tempting to turn to hyper-engaging AI interfaces to keep students learning, but this is a bit like treating cocaine withdrawal by administering heroin. Such students will be unfit for the workplace, which requires the ability to tune out distractions and listen to a colleague’s presentation during a meeting, for instance. We urge the Department of Education to encourage the deployment of AI tools and educational practices that limit time-on-device and restrict students to specific applications.

[4] Karrie E. Godwin, et al., “Classroom activities and off-task behavior in elementary school children,” Learning and Instruction, 44 (2016), 128–143 (https://doi.org/10.1016/j.learninstruc.2016.04.003); Eric D. Ragan, et al., “Unregulated use of laptops over time in large lecture classes,” Computers and Education, 78 (Sept 2014), 78–86 (https://doi.org/10.1016/j.compedu.2014.05.002).

Third, the sheer ease of asking a computer to solve problems or answer questions can undermine students’ ability to develop critical intellectual skills and disciplines. Deployed judiciously, calculators may save time in basic computation to allow students to move on to higher math skills, and word processors can take some of the grunt work out of writing to enable students to focus on organizing and articulating ideas. Beyond a certain point, however, the reflexive reliance on computers to do any task that might require effort will breed students incapable of tackling difficult problems and developing key academic skills. All the evidence thus far suggests that AI, by vastly expanding the number of tasks which students can offload onto a computer, is weakening students’ writing, reasoning, and problem-solving skills.[5] We urge the Department of Education to encourage the deployment of AI tools and educational practices that help students to develop skills rather than simply completing tasks for them.

[5] Michael Gerlich, “AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking,” Societies, 15.1 (2025): 6 (https://doi.org/10.3390/soc15010006).

Finally, while a careless deployment of AI is likely to exacerbate negative trends produced by too many screens in the classroom, we must also stress that AI poses unique new challenges against which the Department of Education must be on guard. Many schools are dealing with epidemics of deepfake pornography, in which students use AI to generate nonconsensual sexual imagery of classmates.[6] A growing body of research suggests that generative AI chatbots are prone to create unhealthy and dangerous emotional dependencies in users, especially children.[7] While it is certainly possible to develop AI tools that guard against such harmful uses, few companies have properly invested in such guardrails to date. Before encouraging the accelerated adoption of AI systems in schools, the Department of Education should insist upon rigorous standards of age-appropriate design that protect children against the dangers of these powerful tools.

[6] Natasha Singer, “Teen Girls Confront an Epidemic of Deepfake Nudes in Schools,” New York Times, April 11, 2024 (https://www.nytimes.com/2024/04/08/technology/deepfake-ai-nudes-westfield-high-school.html).
[7] Common Sense Media, “Social AI Companions,” July 16, 2025 (https://www.commonsensemedia.org/ai-ratings/social-ai-companions).

The Department is right to encourage AI literacy in our schools, and to envision a future where AI tools can improve teacher training and educational outcomes. These benefits, however, will only be realized if accompanied by thorough research and thoughtful design. The more powerful the tools we use, the more is at stake; if AI is the most powerful tool we have yet produced, we owe it to our fifty million American schoolchildren to design and deploy it wisely.


Brad Littlejohn
Brad Littlejohn is Director of Programs and Education at American Compass. He also serves as a Board Member and Coalitions Advisor for the Digital Childhood Institute. You can follow him on X at @WBLittlejohn.