The 3 Questions Researchers Want Educators to Ask About Ed-Tech

Philadelphia

K-12 districts use an average of about 550 ed-tech products, but efficacy is rarely the reason any of those products is in classrooms, a panel at the national meeting of the Education Innovation Clusters agreed last week.

How to change that is a conundrum for educators, researchers, and nonprofits alike.

Ed-tech products “are bought on the basis of marketing, not merit,” said Bart Epstein, CEO of the nonprofit Jefferson Education Exchange.

One way various groups are trying to have an impact is by simplifying the discussion. They recommend that educators focus on asking—and being able to answer—three key questions as they select and use an ed-tech product:

  • What does this technology claim it will do for your school?
  • How will you know if the technology is doing what it claims it will do for your school?
  • What evidence is there that the technology has done the same for other schools like yours?

“We need to develop a culture where these questions drive the adoption of technology, as opposed to just [ed-tech providers’] branding and sales,” said Epstein. He has been working with Digital Promise and the Chan Zuckerberg Initiative to encourage educators to use these inquiries. A webinar hosted by Digital Promise two months ago highlighted these questions and in June a group that included leaders of ISTE and other organizations discussed them, he said.

“I’ve talked to procurement officers who told me, ‘I’ve just renewed $1 million in ed-tech purchases and I have no idea whether they’re having an impact or not,’” reported Joseph South, ISTE’s chief learning officer, as he moderated the #EdClusters18 session, which was called, “Who’s Taking On ‘Efficacy’ in the Ed-Tech Ecosystem?” About 80 participants from groups around the country broke out into work groups to discuss different approaches to the question.

Crowdsourcing Answers

To answer the question, “How will you know if the technology is doing what it claims it will do?”, educators will need help, the panelists agreed.

Teachers and administrators rely on their peers for opinions about ed-tech, not research. “What they don’t have is a place that aggregates the experiences of many educators, and what they don’t have is a place that somehow attempts to measure objectively the impact of those tools,” said South. “That’s what we’re all trying to figure out.”

Possible solutions come from organizations represented on his panel: the new Jefferson Education Exchange, where educators are paid stipends and receive support for documenting their experiences implementing ed-tech; the ISTE Edtech Advisor, a members-only review and rating platform that gives educators insight on tools, apps, and resources to inform decision-making; the LearnPlatform, an online tool for districts’ ed-tech management and rapid-cycle evaluation; and the Ed Tech RCE Coach, a toolkit of resources for educators to run a rapid-cycle evaluation of an educational technology.

“By crowdsourcing the feedback from educators and administrators, we don’t need data from every teacher and every classroom,” said Epstein. When teachers’ feedback is collected across districts, it will be possible to find out how a certain product worked for thousands of algebra teachers, for instance.  

Lea(r)n has found that 548 ed-tech products are used, on average, by districts on its LearnPlatform. Schools are running “hundreds of evaluations on dozens of products,” said Daniel Stanhope, vice president of research and analytics at Lea(r)n. “Getting everyone at the table, and working together to do this, gives much better and more valuable evidence,” he said. He gave the example of schools that set up evaluations with product companies and partners like the Highlander Institute, an education nonprofit that focuses on researching, developing, and disseminating innovative methods to improve outcomes for all learners.

“Ask for evidence,” said Alex Resch, the deputy director of state and local partnerships at Mathematica Policy Research, to the education innovation cluster leaders. “None of the other things we do matter if no one is asking for evidence.”

Follow EdWeek Market Brief on Twitter @EdMarketBrief or connect with us on LinkedIn.


3 thoughts on “The 3 Questions Researchers Want Educators to Ask About Ed-Tech”

  1. Billions upon billions of dollars spent on Ed Tech and, now, questions are being asked about the programs’ merit. The Public Private Partnership cart is before the education horse, yielding addictions and constant RF-radiation biohazard exposure for children and teachers in wireless classrooms via wireless access points and one-to-one devices.

  2. As a new player in the edtech field, but having been through both sides of the process, it has been my experience that carefully scripted demos in quarantined demo environments that focus on visuals and emphasise the best features of the products win the procurement race. The other key influencer is risk. Big business prefers to do business with big business, as big vendors can afford to carry a higher amount of risk and are able to absorb the heat better than smaller vendors can. The net effect is that smaller vendors with potentially superior products, which focus on outcomes more than features, are effectively shut out of the procurement process. Changing the procurement process and evaluation criteria to focus on what problems the solution solves and how it solves them would be a major step forward for edtech vendors, schools, and students. Unfortunately, that is not likely to happen any time soon, and until then, stories like UltraNet and LMBR (AU edtech disasters) will keep popping up in the media.
