Want to design better AI tools for education? Listen to the students!

By Megan Humburg¹, Ph.D., and Dalila Dragnić-Cindrić², Ph.D.

¹Indiana University; ²Digital Promise

Key ideas:

  • Students recognize the potential of AI for personalized learning and have sophisticated insights about how they want AI to support their own learning.
  • Students need support in learning how AI is built and trained, as well as how to interact with it productively in different forms.
  • Students have valuable, concrete ideas about the design of classroom AI tools, and including them early and often in the design process is key to building effective AI technologies for education.

Our teams at the EngageAI Institute are working to build a narrative-centered, AI-driven educational game where students can investigate scientific problems by exploring a game world, chatting with different characters, and gathering evidence to solve a mysterious illness outbreak. 

As part of our design process, we recently conducted focus groups with middle school students to help us understand how AI-driven aspects of educational games might get them excited about science and help them learn. Students played the prototype version of the game for approximately 20 minutes and then chatted with us about game design, AI, and learning. 

Here’s what we’ve learned so far…

Students recognize the potential of AI for personalized learning

Students were excited about the potential for AI to adjust learning experiences to what they know, what they’re interested in, and when and how they might need to be supported.

After playing the game, students were eager to offer feedback on how AI might adjust their learning experience, suggesting personalized versions of the educational game aligned to each student’s interests, skills, and background knowledge.

The students chatted about how what is interesting to one student might be incredibly dull for another. For example, one student talked about how much they enjoyed learning about everything to do with nature, while another said their family spent so much time gardening that the topic of plants and nature was too familiar to be exciting. Students expressed a desire for adaptable learning experiences that are more fine-tuned to their particular interests and knowledge. 

For the two of us, as researchers and educators who are invested in the outcomes of this research area, it was encouraging and inspiring to learn that the students saw highly personalized learning as a worthwhile aim for the design of AI tools. 

Students need support in learning how to interact productively with AI in different forms

Given the popularity of mainstream AI-driven chatbots such as ChatGPT, students need access to resources about how AI-driven technologies are designed and built and what different types of AI can and can’t do.

One key struggle that students expressed in our discussions (which we have also experienced ourselves!) was not knowing how exactly to interact productively with an AI-driven character. Through our reflections, we recognized that much of the experience students have with AI comes in the form of chatbots such as ChatGPT, where the default mode of interaction is to ask as many off-the-wall questions as possible to test the limits of the bot’s so-called “intelligence.” Students in our focus groups interacted with our AI-driven character in a similar fashion, asking questions such as “When did Abraham Lincoln die?” and “What is a mammal?” to see if the character could answer a variety of queries the way ChatGPT can.

However, the character in our game was designed to be an expert in epidemiology, so students became increasingly frustrated when they received responses saying the character didn’t know about history or mammals. 

“The students debated whether or not the AI was ‘intelligent,’ given that it couldn’t answer what some considered basic questions.”

One student countered this argument by saying, “It’s gonna have AI for, like, medical situations, and like, illnesses and diseases […] It’s not gonna be like Google.” 

The students’ reactions to an AI agent with a limited scope of knowledge raised an interesting question for our team to debate: should AI tools be designed like a human (knowing some things but not others) or like a search engine (able to retrieve anything easily found on the internet)? To us, these discussions also highlighted that students did not fully understand how AI tools are built and trained, and thus were unsure how to evaluate what an AI should know or do.

Students have valuable, concrete ideas about the design of AI tools for the classroom

The questions students are asking about AI in education are much like the ones the rest of us are asking: When is AI useful? When is it not? What is it for?

We were impressed that, despite their limited knowledge about AI, students were able to bring up compelling questions and tensions around the purposes of AI tools in the classroom—and when they’re worth the time and the energy to create and use. 

“One idea that kept coming up was the belief that an AI tool is most useful when it is able to interact in supportive, human-like ways and help the students regardless of their initial knowledge levels.”

While admitting to limited technical knowledge, one student offered insightful ideas for making the AI-driven characters easier to interact with. They suggested integrating examples of meaningful exchanges with the character, so that students who struggled to interact with the AI could learn how to formulate good questions about science topics and practices. We’re currently implementing a version of this idea in the updated game.

Another student cautioned against designing AI tools that don’t offer any unique contributions to a learning environment, saying, “But it’s just, like, the wrong place to put AI. Like, I can understand AI in a chess engine, where you’re looking for competition. But it’s just like putting too much…yeah, no offense, it’s like putting glitter on a pig, because it didn’t really need that glitter.” 

We were thrilled to hear students wrestling with the same questions we were as researchers and educators, reflecting on how an AI tool is only as powerful as its ability to do something that can’t already be done well without it.

Including students early on in the design process is key to building effective and engaging AI-driven educational technologies

The development of AI technologies for education is rapidly expanding and shows no signs of slowing down. While we are excited about the possibilities of AI in education, we also want to ensure these tools are designed with care and intention. If we want AI to have a positive and useful impact on students’ learning and engagement in classrooms, we need to include students as thought and design partners throughout the development process. Our conversations with students showed that they have important ideas to share about when AI is useful for their learning and when it isn’t. Talking with students early and often reveals potential challenges in the design of AI tools that can be solved before they become lasting problems. Listening to students’ ideas about AI can also give us insight into where we might need to design educational supports that help students learn how AI works, and how to work with it.

To learn more about the work we’re doing at the EngageAI Institute, check out these other blog posts:

Are you an educator who wants to have a voice in the design of AI tools for education? You can contribute your own ideas by signing up to participate in one of our Teacher Focus Groups.