How AI Products Can Promote Child Development, Not Just a ‘Commercial Mindset’

Staff Writer

Vendors in the K-12 space need to consider more than whether artificial intelligence can increase productivity in the classroom. They must also weigh how the emerging technology can impact a child’s development, in the view of one researcher.

Education companies attempting to bring products using AI into the market should put a major focus on what is appropriate for different age groups, said Shelley Pasnik, senior vice president of external affairs for the Education Development Center, a global nonprofit. Those needs are likely to be overlooked, she said, if the dominant focus is AI’s ability to save school districts time or its capacity to produce new classroom resources.

“There is a lot of promise and hope about the expanded reach that AI may bring,” Pasnik said. But there are also risks in the “dehumanizing quality of AI, and also the lack of validity.”

Pasnik has long studied the impact of digital media on children’s lives. She formerly served as director of the Center for Children and Technology in New York and led the U.S. Department of Education grant-funded research program for PBS’s Ready To Learn. Her research related to AI is ongoing in her role with the EDC, an organization that designs, implements, and evaluates programs in the areas of health, education, and economic opportunity.

EdWeek Market Brief recently spoke to Pasnik about concerns over how AI could affect child development, how vendors should think about designing products using the technology, and what the term “developmentally appropriate” actually means.

What are the biggest questions you’re thinking about when it comes to AI and child development?

Some of the questions [that we’ve been looking at] have to do with systems. How might the introduction of … artificial intelligence change the educational system — and then also what experiences young people need to have in order to develop in a way that’s fruitful and healthy.

One worry is that an artificial intelligence-enabled experience is going to disrupt learning, that it really is a move towards automation, a move away from what is human.

Why would that, on its own, risk undermining children’s development?

There’s also a worry that environments themselves will start to dictate human behavior, that we will not be allowed to be our full range of selves. That certainly comes up [in AI] with prescriptive environments as well as the pernicious concern over bias. In an educational setting, or from the standpoint of a young person who’s just coming into her or his own, the worry is that the learner doesn’t get the benefit of really developing in a way that is constructive. That instead, these environments are going to be too limiting.

Another is that artificial intelligence and the way it’s being constructed by many of those creating products and creating services [is too focused on] a commercial and engineering mindset.

What do you mean by this emerging technology having a “commercial mindset”?

Having a finite answer. That can be everything from literally having the [technology give the] answer to a question, to a very narrow back and forth with a chatbot, for example. You might want, developmentally, to go back and forth. Instead [the use of AI] is too narrow, too finite, when in fact learning needs to be much more open-ended.

Those parameters can be too constricting. It really undermines the larger effort of what learning is, that it’s meant to be about experimentation, be about the quest for knowledge and that process of acquisition.

From a developmental standpoint, it’s that building and deepening of relationships that is so critical for the teachers’ experience, but certainly for the students’ experience as well.

What does a less narrow experience with technology look like for students?

Is the design to have a product that always takes you back inside the product — that experience is kind of a closed loop? Like YouTube algorithms, [there’s the] constant enticement to go back, go back. When is the design to push back out and to really bring the teacher in or bring peers into engagement?

If the primary input is from a student into the system, responding to a question, either writing or answering some sort of prompt, when is there a mode that requires input from more than a single student?

We know that for systems of learning and spaces of learning to be successful, it’s not only the academic cognitive skills [that are important], but the development of the non-cognitive skills: collaboration, resilience, perseverance.


The term “developmentally appropriate” is widely used in education, without a lot of clarity on what’s actually meant. How do you think schools and vendors should be thinking about it?

“Developmentally appropriate” is thrown around; it’s a label that gets slapped on a lot of offerings. And it’s helpful to slow down and think about what is developmentally appropriate and when a product is really meeting that standard.


What is developmentally appropriate for three-, four-, or five-year-olds is very different from what would be developmentally appropriate for high school students, where they are in relationship to their own independence and their own ability to self-regulate. For a three-year-old, the concept of the world is really family. By four or five, maybe they’re starting to get a sense of neighborhood. That’s very different from a high school student, where you really want a diversity of opinions.

When you’re working within a model for younger kids, it’s important for a vendor [to not have] any introduction of bias or erroneous information, if at all possible — and really, that needs to be airtight. Digital literacy is something that can happen at a later stage of development, with older students, in a way that isn’t possible at a younger stage. That would be an example of where bias and inaccurate information could be really counterproductive for the learning that you’re trying to support.

How should vendors be thinking about AI and child development moving forward?

Any product will be most successful if it is in this larger ecosystem of how our relationships are being nurtured. For everything that goes in the classroom, we should have an understanding of how it relates to teaching. That’s priority number one. Designing for the interplay of students and teachers together, that’s difficult to do.

This conversation has been edited for length and clarity.

