District leaders who are eager to try out ed-tech products but wary of committing to big purchases sometimes stage pilot tests, so that teachers, administrators, and students can give digital tools low-stakes trial runs.
Yet many pilots fall well short of what school officials hope to accomplish.
As we’ve reported, companies and schools often go into those projects lacking clear goals. In other cases, teachers and students aren’t given enough opportunity to offer feedback on products. And even technologies that win raves from educators can get bogged down in bureaucracy and go unpurchased for long stretches.
While the primary audience for the web tool is K-12 officials, vendors can use it, too, to structure pilots in ways that make sense for schools, according to Digital Promise, which has studied problems with pilots.
The framework walks district officials through an eight-step process for staging a pilot:

1. Identify the district's needs.
2. Discover and select the product that is the right fit for the district.
3. Set plans with clear goals and a shared vision within the school system.
4. Ensure that teachers have the training to implement the tool effectively.
5. Collect data to make sure goals are met.
6. Analyze the data to decide whether to continue the pilot, drop it, or make adjustments.
7. Negotiate and decide whether to move toward a full-blown purchase of the product.
8. Summarize and share results with pilot participants.
Each of those steps is outlined on the website, and each comes with resources to help districts, including guides and videos created by Digital Promise, as well as materials from other sources, ranging from individual school leaders to the federal government to research organizations that have studied product testing and evaluation, like Mathematica Policy Research.
Digital Promise created and organized the resources based on its review of nearly two dozen pilots conducted in districts over a two-year period, said Aubrey Francisco, the organization’s research director.
Some of the barriers in setting up pilots that pay off for schools and students don’t have easy solutions.
For instance, companies and school officials alike complain that the timing and length of districts' ed-tech pilots often leave school officials who are impressed by a piloted product unable to carve out a place for that tool in the next year's budget.
Even if a product gets tested in the fall semester, district leaders may not know enough about its success to move to a full-blown purchase of it by January, when many school systems lay out initial budget plans—if they don’t start doing so even earlier.
Digital Promise doesn’t have a cure-all for that mismatch in timing, Francisco acknowledged. But the framework does offer resources, including model piloting-and-evaluation timelines, that could help districts' decision-making.
The framework is meant to be a “step-by-step process to help school and district leaders,” she said. “Districts tend to rely very heavily on pilots, but the process they use tends to be pretty informal.”