New Institute Backed by National Science Foundation to Explore AI’s Role in Education

Staff Writer

An ed-tech nonprofit will join four universities in launching a new institute dedicated to creating artificial intelligence tools that can be applied to human learning and education.

The effort is meant to encourage the development of products for use in K-12, influence future AI products made for K-12, and improve upon past AI technologies that were difficult for teachers to use, said Jeremy Roschelle, executive director of learning sciences research for Digital Promise, the education nonprofit involved in the initiative.

“There’s an emphasis here on what people are calling classroom orchestration – how to help teachers organize for longer-term, more complex, collaborative, problem-solving things,” Roschelle said. “I think the classroom orchestration part, in particular, could be part of a big change in what products people emphasize in the market, and how they support teachers.”

A five-year, $20 million grant from the National Science Foundation will support the AI Institute for Engaged Learning, Digital Promise said. Analysts, policymakers, and product developers from Digital Promise will join researchers from North Carolina State University, the University of North Carolina, Indiana University, and Vanderbilt University for the initiative.

The work of the institute will have three main goals:

  1. Build platforms that incorporate story-based problem scenarios fostering communication, teamwork, and creativity.
  2. Develop AI characters capable of communicating with students through speech, facial expression, gesture, gaze, and posture.
  3. Create a framework that customizes educational scenarios and processes to help students learn, based on information collected from students’ conversations, gaze, facial expressions, gestures, and postures as they interact with one another, teachers, and the technology itself.

Schools, museums, and outside nonprofits will work with the institute to ensure the tools created are ethically designed and advance diversity, equity, and inclusion, according to the announcement.

District officials and advocates for the ethical use of technology have repeatedly raised concerns about potential pitfalls in applying AI-powered technology in schools. One fear is that because AI systems depend on collecting large amounts of data and using algorithms to guide policy and classroom practice, they will end up reinforcing racial, gender, or other stereotypes.

For example, could an AI-powered curriculum platform, or one that recommends academic interventions for students, end up directing more students of color into remedial coursework because of biased algorithmic assumptions? (See Education Week’s recent special report breaking down concerns about AI’s role in classrooms.)

Data Privacy in Focus

A November report by the Center for Integrative Research in Computing and Learning Sciences cites several concerns and considerations regarding how AI technologies safeguard student privacy.

How will AI-recorded student conversations and emotional data be used? How long will information be saved? Will it be part of a student’s record? These are all questions that come into play when AI and children interact, the report notes.

AI detection of emotions through facial expressions is well developed, but it poses privacy and ethical challenges, and appropriate policies to address those challenges must still be determined, the report says.

“A very strong focus of this institute … is coming together to really think about how do we tackle some of these issues of privacy, security?” Roschelle said. “None of this is going to fly if people are terrified.”

If AI can be applied creatively and responsibly, it has the power to enrich lessons across subjects, Roschelle said.

He offered an example detailing how forthcoming AI tools might generate story-based situations that promote collaboration and creativity.

Imagine a science class planning a trip to Mars over a three-week period, he said. For the purposes of that trip, students would need to measure gravity, the strength of the Sun’s energy, and air moisture. They would have to plot out the measurement devices they need, the composition of the student teams taking the measurements, and what vehicles to bring.

In this case, an effective AI system could “help them along the way whenever they get stuck,” Roschelle said, and “tune the story to the choices they make.”
