What Factors Drive or Doom Ed-Tech Implementation? New Study Takes a Look

Associate Editor

The Jefferson Education Exchange is launching an effort to identify the 10 best indicators of whether ed-tech implementations will work in school districts.

The nonprofit organization, whose goal is to help educators and education leaders make better-informed decisions about ed-tech, is starting an extensive study designed to give school officials a research-based understanding of what ed-tech is likely to work in their districts, under what conditions, and why, as well as what is unlikely to work, when, and why not, said Bart Epstein, JEX's president and CEO.

“The data right now suggests…that well more than half of the education technology going into our schools is either not being used properly, or not being used at all,” said Epstein.

JEX is recruiting now for a steering committee of educators to help identify what matters most in ed-tech implementation. This team will distill more than 70 variables that may be associated with the success or failure of an ed-tech implementation into the 10 most important indicators linked to achieving strong outcomes in classrooms.

The long list of indicators is grouped into 12 categories, including: technology perceptions; professional support; school vision and school culture; features of technology; student-centered factors; and concrete characteristics, like schools' demographics and financial resources.

The six dozen variables of implementation that will be studied include:

  • Teachers’ beliefs about technology
  • Trust
  • Technology’s alignment with curriculum, content, and priorities
  • Administrative support
  • Teacher agency
  • Educator motivation
  • Student agency
  • Family buy-in/beliefs about technology
  • Instructional technology support
  • Effectiveness of technology
  • Measurable outcomes of technology
  • Ease of use

Once the committee chooses what it considers to be the key variables, a dozen working and technical groups will define each variable and identify how it will be measured in the field, according to Dan Brown, JEX's director of national engagement, who will be managing the steering committee and working groups.

“Our methodology is to collect large amounts of data from actual practitioners, and then use the tools of big data to do a complex analysis to understand which variables are having the most impact,” said Epstein.

Ultimately, these variables will be used to create an “implementation framework” for ed-tech, designed to be the backbone for new research and tools to help teachers gain insight on ed-tech experiences from peers across the country who share similar contexts.

“This is a very complicated, long-term process,” Epstein said, “and we are likely to have helpful answers without knowing everything.”

Through this effort, JEX is trying to help educators and decisionmakers answer the question, "What are the experiences of people like me?" when it comes to ed-tech, said Epstein.

JEX originally took up this issue when it hosted an "EdTech Efficacy Research Symposium" in 2017, where nearly 300 investors, academics, researchers, philanthropists, and entrepreneurs met to hear the preliminary findings of 10 working groups.

