‘Deskilling’ is bad. This is worse.

Authors of book about AI in K-12 classrooms say loss of foundational knowledge is biggest threat

Educators should teach students how to use AI tools but with an emphasis on the ethics, social impact, and potential biases of the tech, experts said Thursday during a conversation sponsored by Harvard Education Press. 

Stephanie Smith Budhai and Marie Heath, who co-authored “Critical AI in K–12 Classrooms,” told Teddy Svoronos, senior lecturer of public policy at the Kennedy School, that responsible use of AI requires a healthy dose of skepticism. In other words: Resist the hype by asking hard questions. 

“Does this really align with our visions of education?” said Heath, associate professor of learning design and technology at Loyola University Maryland. “Does this serve communities, as opposed to the folks who are developing this technology and telling us it’s going to be transformative?”

Budhai, associate professor of educational technology at the University of Delaware, said that teacher education programs should include training on how to help students examine the effects of AI inside and outside the classroom, including its environmental impact. A sort of critical AI literacy is needed, she said. 

“We’re not saying we have to be anti-tech,” said Budhai. “We’re saying: Let’s think about the bigger questions. … Students need to build a critical consciousness around the ways we interact with AI and understand how it works.” She added: “They need to really understand the harms of it.”

Educators are concerned about students’ over-reliance on AI and its possible impact on critical thinking, problem-solving, and relationships, the authors noted. The threat, they said, goes beyond skills students have already developed and might lose as they outsource essays and other assignments to machines. It runs much deeper. 

“Students don’t know how to write a topic sentence because they’re asking AI for the topic sentence,” said Budhai. “They’re ‘never-skilling,’ which is even scarier than ‘de-skilling,’ which is losing the skills they had because they’re over-relying on AI. Never-skilling means they’ve never learned the skill because they are using AI for everything, so they don’t even have foundational skills.”

Heath, a former high school social studies teacher, worries about the impact of AI on social interactions and civic life. 

“I think about the ways that these technologies, particularly generative AI, allow us to be frictionless in our activities, and it sort of reduces the need for human interaction,” she said.

“For democracy to function, we need to be able to sit in discomfort, and we need to know what it feels like to disagree and to be disagreed with. One of the things that we give up when we turn to this technology is the ability to sit in discomfort and practice being uncomfortable.” 

The authors also zeroed in on the problem of biases, explicit and implicit, in AI tools. In researching “Critical AI in K–12 Classrooms,” they asked AI for book recommendations for Black and white high school students, and they found that the lists and even the feedback had implicit biases, with the books for Black students disproportionately about crime and poverty.

In a separate research project, Heath detected biases when AI provided feedback on students’ written work.

“AI is laden with all the biases of society,” she said. “If it perceives that the student is either from a higher socio-economic class or white, the feedback it gives is very conversational in tone, like, ‘Have you thought about XYZ?’ If AI perceives that the student is either socio-economically disadvantaged or is a Black or brown student, it uses a very direct, authoritative tone.” 

The message from the tool, Heath said, is, “‘I know what’s right’ and ‘You should do this this way.’”

Sharing takeaways from their findings, Budhai and Heath urged educators to pause over a simple question, why?, before deploying AI in the classroom.

“For people who train teachers to use technology, it’s really important to have a framing where anytime you’re using technology, it’s for a purpose,” said Budhai. “We call it ‘purposeful technology use.’ I tell students, ‘How does this help meet the learning objectives?’ Because if it’s not actually doing it, why are we using it?”