Learner agency in AI-driven, narrative-centered learning environments: An interview with James Lester and Jonathan Rowe

Interviewed by Deblina Pakhira

In this post, Dr. Deblina Pakhira, Research Manager at Digital Promise, talks with Dr. James Lester, Principal Investigator and Director of the EngageAI Institute, who is also a faculty member in Computer Science and Director of the Center for Educational Informatics at North Carolina State University, and Dr. Jonathan Rowe, Managing Director of the EngageAI Institute and a senior research scientist in the Center for Educational Informatics at North Carolina State University. In their conversation, they discuss learner agency in narrative-centered learning environments.

Key Ideas

  • Communication and collaboration between learners and machines will help learners produce results that are more distinctive and human than what machines can achieve alone.
  • EngageAI Institute’s research is exploring the effect of AI technologies on learner agency. 
  • A vision for the future of AI is to address and combat the systemic problems in education.

Interview

Deblina Pakhira: Please tell us a bit about yourselves and what inspired you to participate in this work. 

James Lester: For many years, my work has focused on artificial intelligence (AI) in education, exploring how AI can be leveraged to improve human learning. At the Institute, we have three areas of focus: narrative-centered learning environments, embodied learning agents, and multimodal learning analytics. In ninth grade, in the 1970s, I took a computer science course, and the very first creative programming I ever did was not for an assignment but just for fun. I created a text-based adventure game; depending on the choices the user made, different events would happen in the narrative, taking them through an interactive story. Creating and playing that game was the beginning of my interest in interactive narrative.

Jonathan Rowe: I am a computer scientist, and my research background and interests sit at the intersection of AI and educational technology. Many years ago, my dissertation work used AI to create dynamically generated narratives that students could participate in and interact with using game technologies. A formative experience that shaped my interest in this area occurred in elementary school. My teacher, Mr. Paul Strubeck, orchestrated a narrative learning activity in which I collaborated with other students to solve math and science problems, often using computers, to help advance a shared storyline. That experience stuck with me, and now at EngageAI, I want to reproduce, scale, and dynamically construct those sorts of learning experiences using AI technologies.

Critical Role of Learners in this Era of AI

DP: There have been recent conversations about how ChatGPT poses a potential threat to learner agency and free will. In this era of AI, do you think it is important for learners to control and monitor their own learning? Why or why not?

JR: There will be successors to ChatGPT and other types of AI systems, and their capabilities will continue to improve and expand over time. These systems can already perform tasks that we currently think of as requiring uniquely human intelligence but that can now be solved partially or fully with machine intelligence. It will become very important for learners, and for people in general, to have a high level of awareness of what distinctive knowledge and capabilities they possess that machines do not. This relates to being able to work effectively with AI systems, which I think will be increasingly important over time. I believe this will have profound implications for education. It will be more important than ever for students to be able to monitor and regulate their learning: what they are learning and how they operate effectively in an AI-infused environment.

JL: To Jon’s point, I believe communication and collaboration will come to be seen as foundational skills we want to cultivate in learners to best prepare them for the future: communication and collaboration not only between learners, but also between learners and the machines they use. These skills, along with an overlay of cognitive awareness and self-regulated learning, will become increasingly important for learners.

JR: To add to our previous points, AI systems today depend on the data they are provided, and those data are created by humans. The datasets are aggregated and then used to train the AI models, and this process shapes the style of output the system can generate. I think there will be a whole space for learners to understand and explore what they can generate that is distinctive and very human, compared to what machines can generate. That highlights the critical role that learners, and people in general, will play as these generative AI systems become more capable.

Enabling Learner Agency in Diverse Contexts 

DP: The Institute focuses on stories and narratives through the use of generative AI. How do you envision the EngageAI Institute’s work supporting learner agency while taking diverse contextual factors into consideration?

JR: Generative AI provides a rich and rapidly advancing set of capabilities that enable story-based learning experiences to unfold in ways that adapt and respond to students’ choices and actions in a learning environment. One major research focus of the EngageAI Institute is interactive narrative generation. Think about an immersive learning environment, like a game-based learning environment, where there is not just a single linear path for the player or learner but a range of possible learning paths. The system is not just sequencing or delivering pre-prepared paths; it is dynamically constructing storylines, character behaviors, and virtual environments. It needs to preserve the scientific content and problem-solving tasks embedded in the storytelling while shaping the narrative so that it is relevant to the learners. Our goal is to provide authentic, relevant, and meaningful learning opportunities in these sorts of narrative contexts that enable learners to feel a strong sense of agency. Embodied conversational agents are another of our research strands. We are developing agents designed to empower students to ask for what they want to learn about, or what type of help they might want at a particular point. I believe that AI in education can be agency-enhancing rather than agency-reducing.

JL: Agency can manifest differently in different learning contexts, for example in a museum versus a classroom. In a free-choice environment, visitors make decisions not only about how they will interact with an exhibit, but about whether they will interact with it at all. In these contexts, learner agency and engagement are very tightly integrated. When AI is integrated into these settings, it can fundamentally affect the student’s experience of agency. Exactly what those mechanisms are is a great research topic that we’re investigating. The next question is, how does that affect the learning experience? Imagine how an AI-driven experience manager could create the learning experiences that are best for a learner at a specific moment in time, in a particular learning context. And then, how can that produce the kind of engagement that leads to the deepest learning? These are really interesting questions that we are investigating empirically right now.

JR: AI can also help improve the quality of learning, and that itself promises to increase learners’ agency by opening new doors and possibilities. One issue we need to address is bias in AI models. Whether it arises in the training, the deployment, or the interpretation of AI models, bias has the effect of reducing agency. The negative impacts of AI bias on agency for individuals or groups are also addressed by the EngageAI Institute’s framework of ethics, equity, diversity, and inclusion.

Looking Towards the Future 

JL: There are systemic problems in education, and AI can help address some of the most serious problems we see in learning at scale, such as K-12 attendance, assessment (finding ways to measure learning outcomes other than high-stakes standardized tests), and workforce development. Some people say that AI is problematic because it is changing what kinds of competencies the workplace will reward. However, AI can also become part of the solution by helping us create learning experiences that are tailored to the individual needs and interests of learners. AI will play an important role in creating learning experiences that prepare students for what is going to be an amazing future.