For all the talk about how artificial intelligence could transform what happens in the classroom, AI hasn’t yet lived up to the hype.
Artificial intelligence involves creating computer systems that can perform tasks typically requiring human intelligence. Educators are already experimenting with it to automate grading, tailor lessons to students' individual needs and assist English language learners. I heard about a few promising ideas at a conference on artificial intelligence that I attended last week, hosted by Teachers College, Columbia University. (Disclosure: The Hechinger Report is an independent unit of Teachers College.)
Shipeng Li, corporate vice president of iFLYTEK, talked about how the Chinese company is working to increase teachers’ efficiency by individualizing homework assignments. Class time can be spent on the problems that are tripping up the largest number of students, and young people can use their homework to focus on their particular weaknesses. Margaret Price, a principal design strategist with Microsoft, mentioned a PowerPoint plug-in that provides subtitles in students’ native languages – useful for a teacher leading a class filled with young people from many different places. Sandra Okita, an associate professor at Teachers College, talked about how AI could be used to detect over time why certain groups of learners are succeeding or failing.
But none of these artificial intelligence applications is particularly wide-reaching yet; none comes close to the transformation of “every aspect of the traditional learning environment” that promoters have imagined, one they say will “usher in a bold new era of human history.”
There is also plenty of reason to worry about what might happen as tech developers accelerate efforts to bring artificial intelligence into classrooms and onto campuses.
Paulo Blikstein, an associate professor at Teachers College, drew laughs by talking about Silicon Valley’s public relations coup in getting us so excited about technology’s promise that we happily parted with our private data, only to learn much later of the costs. A handful of tech CEOs “caused enormous harm to our society,” he said. “I don’t want that to happen in education yet again.” Stavros Yiannouka, chief executive of the World Innovation Summit for Education (WISE), a project of the Qatar Foundation, and a panel moderator, agreed that there are great risks in letting artificial intelligence loose in classrooms. He pointed out, “You don’t need to have sinister objectives or plans for world domination to get things horribly wrong.” Andre Perry, a fellow at the Brookings Institution and a Hechinger contributor, talked about how tech companies may cement racism and other biases into algorithms unless they employ diverse teams and consciously fight against inequities.
As Blikstein noted, AI educational applications come in two types – tools that involve computers shaping how learning happens, and those that engage students in using AI to code and program. In a panel moderated by my colleague Jill Barshay, Stefania Druga, a PhD candidate at the University of Washington, discussed a platform she’d created called Cognimates. It enables children to use artificial intelligence to train and build robots.
Druga talked about how kids first assumed the robots were super brainy. But once students learned how to train a robot, she said, “their perception goes from ‘it’s smarter than me’ to ‘it’s not smart,’ significantly. We see that kids become not only more critical of these technologies but also more fluent.”
She mentioned the creative and unexpected projects students wanted to tackle, including building a chatbot that gave back-handed compliments (a concept that Druga, who grew up in Romania, wasn’t initially familiar with). “We need more silly instead of smart technologies,” Druga said, “that puts the focus on people and allows people to do what they do best.” In her evaluations of Cognimates, she found that students who gained the deepest understanding of AI weren’t those who spent the most time coding; rather, they were the students who spent the most time talking about the process with their peers. That left me thinking that it’s from other humans that we tend to learn the most — and peers and teachers will always play a central role in education.
Editor’s note: This story led off this week’s Future of Learning newsletter, which is delivered free to subscribers’ inboxes every other Wednesday with trends and top stories about education innovation. Subscribe today!
This story about artificial intelligence was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s newsletter.