Lessons for Ed-Tech Companies in LAUSD’s Far-Reaching AI Experiment

Staff Writer

San Diego

Joanna Smith-Griffin, CEO & Founder of AllHere

Last month the Los Angeles Unified School District announced that its more than 560,000 students will use a new AI-powered learning tool.

This product, named “Ed,” was developed through a public-private partnership with AllHere, a Harvard Innovation Labs venture, and will provide students with access to information like grades, attendance, bus schedules, and outside resources for learning.

LAUSD’s collaboration with AllHere has produced a tool that will serve as a 24/7 learning assistant to the district’s students and their families. The project’s rollout will almost certainly be closely watched by leaders of other school systems who are curious about practical ways they can use AI to improve classroom instruction and operations, but also wary of the downsides associated with the technology.

At the annual ASU+GSV Summit this week, AllHere CEO and founder Joanna Smith-Griffin spoke to EdWeek Market Brief about her organization’s efforts to make the partnership work, and what it takes to get one of the largest school districts in the country to make a bet on the power of AI.

What does it take to set in motion an ambitious partnership of this sort with one of the biggest school systems in the country?

Coming in with a perspective of being willing and eager to learn is important. So often in ed tech, it’s easy for vendors to create a product that they think is fully done and well-packaged and just present it as an output. One of the most unique attributes of a public-private partnership is that you involve the partner entity almost as an equal, or even more so, in shaping the creation and the development of a tool.

The essence of public-private partnership is finding those most authentic avenues by which your partner, in this case the [560,000 students in] LAUSD and their parents and their staff, could most inform the development of the tool itself. It is a slower way to go, but there is this quote that says, “If you want to go fast, go alone.” But if you want to go far, think about how you can work with the field to produce innovation where the whole is greater than the sum of its parts.


You mention collaboration. What kind of collaboration with other communities was needed to get this off the ground? 

We use parent voice heavily. We did focus groups and interviews at every stage of the creation process for Ed, where the emphasis was not so much on asking them what they did like, but even more often, “What’s not helpful here? What don’t you like?” That way, we were always able to keep a tight alignment between the problem that we were trying to solve for and the end [product], really centering the creation of the tool in user voices.

We did the same for kids as well, asking them about what it was like for them to navigate through all these different platforms. Kids have access to over 2,500 different ed-tech tools a day, and even considering single sign-on, that’s a lot of cognitive load for them to have to navigate all those things. It was only through prioritizing student voice that we were able to be aware of that as something that was top of mind for kids.


Your product, Ed, emphasizes capabilities in personalization, which has been a goal of schools — and companies — for years. Where are other areas of opportunity for AI when it comes to individualizing learning?

One is the orchestration of tools. As I mentioned, a student on average has access to over 2,500 different ed-tech tools each day, and AI has just lowered the barrier for the creation of new tools on the market. One of the ways AI is very helpful is in creating a thin layer on top of different platforms and tools to guide a student’s path through them based on their strengths as well as their needs. How do we take every single vendor partner’s tool as an input, but then use AI to create new relationships between each of those tools to better serve student outcomes?

Another area of opportunity is AI for continuous measurement. A lot of schools rely heavily on assessments at the beginning, middle, and end of the year. But what if there are tools that make assessment feel like just a part of a student’s daily work rather than an event? How do you go from assessment as an event to constantly assessing in ways that feel natural and non-burdensome to the end user, and are still being used to create adaptive experiences?

What other doors does AI open?

The third is hyper-targeted intervention. Before doing this work, I taught 6th and 8th grade math. Even in my one class of 30 students, I may have had 15 different levels of performance on any one given skill. The work of an educator in that space is to do what only they can do. You’re trying to teach to the median, or even the mode, of where those learners are at the same time in the same space.

So there’s still a lot of room to define the authentic copilot [tools] for an educator. I haven’t seen a product yet that makes educator voice a central part of determining what that is, so I think that’s a ripe area for opportunity.

What does it take to get district, school, and classroom leaders interested in AI at this time?

Teachers are already asked to use so many different tools, and that creates glut if, every different day, it’s a different widget, a different tool. It does lead to a sense of fatigue around what is out there. School systems like Los Angeles Unified are thinking about how AI, when implemented in either an instructional or an operational capacity, can actually take a load off of teachers’ plates. Products that aim to do that as a core value may find more traction than ones that merely add innovation or buzz. That’s going to force companies to really think about how they learn what problems today’s teachers are trying to solve, and whether the use of AI will actually add value to people’s roles.

Teachers have an inherent right to reject anything that’s not adding value to the critical work that only they can do. But that raises the bar for companies on what they have to do in order to earn the opportunity to be used in a teaching space. School systems’ and educators’ standards for what constitutes a product of value — enough to use in my classroom — will increase.

What other kinds of concerns are you hearing from districts that pose barriers to their experimenting with AI?

[Districts] have to create policies, standards, and guidelines over everything from the governance of data to defining what AI tools can and can’t have access to, in terms of student information. They also need to do work to make sure that the vetting of any AI tools honors existing law. A lot of AI is embedded in tools they have already purchased but that just weren’t marketed that way.

Another important concept is, what are the provisions for human oversight, human supervision, and also human intervention if an utterance comes into a chat that signals an emergency? The possibility of harm to self or harm to others is one example. Another is content filters. It’s about being proactive if you’re using an AI-based tool that involves chat: What safeguards are in place to create a protected space where students and parents can ask questions about certain topics, and maybe not about others?

From your experience developing Ed, what sets apart a successful AI tool?

We’ve been able to create new relationships between different ed-tech tools on a single learning path for kids, in a way that feels joyful and fun, like a game. That level of interoperability creates new possibilities for meeting every student where they are in a hyper-personalized manner, in ways that just haven’t existed up until this point, and with major content partners. I’m excited for when we create one true common sandbox for all these different entities to play and work in. It’s good for kids in terms of what their learning experience looks like, and it will be good for student outcomes.

[It’s also] determining that [tools] are safe and secure for kids and for families. AI presents a host of new questions that school systems need to tackle, including digital citizenship … and building literacy around AI as a key component.
