A Future Without Summative State Tests? NWEA Touts New Product
The assessment provider NWEA has announced ambitions to develop a new series of “through-year” tests that it predicts could replace annual, year-end summative exams that are scorned by many teachers and parents.
Whether the nonprofit testing provider can accomplish that feat—and whether its strategy would improve upon the testing approach that states and districts are already using—remains to be seen.
NWEA says its plan, which it unveiled this month, will reduce overall testing time, and make the results of state-administered tests more useful to educators.
Many school districts today are already giving “three assessments a year to students, and then at the end of year, a state assessment on the same content,” NWEA CEO Chris Minnich said in an interview.
“We’re gathering all of this information on students — and we’re not doing it to the benefit of the students. We’re just asking them to take assessments,” he said. That approach doesn’t produce useful information for schools, educators, and parents, Minnich argued.
One of the most common complaints about summative tests is that they’re given to students at or near the end of the academic year, which educators say makes them of little use in trying to figure out where students are struggling, and how customized lessons would help them.
Under its new approach, to be rolled out over the next two academic years, NWEA would give three computer-adaptive tests—in the fall, winter, and spring—similar to the schedule used for interim tests by many districts.
The exams would take about 45 minutes each, for each subject tested.
NWEA will market them to individual states, which run the accountability systems that count the scores on summative tests, said Minnich. The exams will be made available for grades 3-8 English/language arts, math, and science.
NWEA would derive the equivalent of a summative test score from those through-year assessments, based either on the third test given during the year or on some combination of all three yearly exams. The testing group says it would work with individual states to determine which of the through-year exams would be factored into the summative result.
Because the three smaller tests would together provide a measure of student learning, the summative tests typically given at the end of the year—which NWEA says can take up to three hours per subject—would no longer be necessary, the organization says.
Pressing to Cut Testing Time
Scott Marion, the executive director of the Center for Assessment, questioned the feasibility of NWEA’s plan. It’s not clear that the new, through-year tests will have a design sophisticated enough to produce a valid, summative score for students, he said.
The process for doing that is “not that straightforward,” he said.
Marion also was skeptical of how easy it will be for NWEA to develop a rich array of test items that can provide an in-depth picture of students’ grasp of academic content.
While the goal of shaving time off of tests might be a good one, the risk with 45-minute assessments is that schools “get what they pay for”—namely, less useful measures of student progress, said Marion.
“You’re going to tell me that [the new, short tests] will provide instructionally rich information?” he asked.
If states and districts are relying on the shorter tests for a true measure of student knowledge, Marion added, those assessments need to cover “the depth and breadth of a content area.”
The new tests will roll out over time. NWEA says they will be researched through pilots conducted among districts in Nebraska and Georgia during the 2020-21 academic year, with the goal of implementing them during 2021-22. (The Georgia effort is supported in part through a federal program meant to develop innovative assessments.)
NWEA already offers a set of interim tests given throughout the year, known as MAP, that are meant to gauge student academic progress over time. Those tests will continue to be sold to school districts, NWEA said. Minnich said that NWEA is not proposing to use its existing MAP tests as the new, through-year assessments—because the MAP was not designed to serve that purpose. (In fact, many test experts have raised concerns about the validity and feasibility of combining interim tests into a summative score.)
The new through-year assessments will be designed to accomplish what the existing interim tests cannot, NWEA says.
The tests will be closely aligned with individual states’ academic content standards, and students will be asked to “fully demonstrate” their knowledge of those standards through a broad array of test questions, said Abby Javurek, the senior director of large-scale solutions at NWEA.
“We build around that, and build around states’ assessment blueprints, to make sure the assessments are covering all the things they need to cover,” Javurek said. NWEA wants tests to measure how students’ “level of thinking is becoming more sophisticated as they learn more throughout the year.”
Grade Levels, and Beyond
One of the challenges facing NWEA—which it hopes to address in piloting the new exams—is ensuring that they are testing students on skills they should know at various grade levels, while also measuring the skills of students who may be well above or below grade level.
“Getting that balance right is what we’re trying to test out,” Minnich said.
The bulk of NWEA’s through-year tests will consist of multiple-choice items and questions that call for fill-in-the-blank responses. The tests will also include other “technology enhanced” items that can assess higher-order thinking, the organization said in an email. The items will be scorable by machine, and adaptive, meaning content will change based on test-takers’ responses.
“Our primary design function is to make sure teachers get information on their kids as quickly as possible,” said Minnich.
NWEA says it also will make performance tasks available in its through-year tests, as a complement to the computer-adaptive test. Performance items will be made available as an option to individual states.
To provide an accurate view of students’ learning, NWEA will not only need to match its new tests to individual states’ standards, but also create tests that are sufficiently complex to capture the full depth of those standards, Marion said. It’s difficult to hit that mark with tests that are heavily reliant on multiple-choice items, he said.
Almost all states today have standards that seek to demand “deeper thinking” of students, Marion said. Without knowing if NWEA’s proposed tests can meet that bar, it’s impossible to know if there’s a “devil lurking in these details.”
But Minnich predicted that NWEA will craft tests that deliver for states an accurate picture of student learning—even if it doesn’t come at the traditional time they’re used to on the academic calendar.
“We’re trying to get the best of both worlds,” Minnich said. “The states have traditionally said, ‘Hey, the only determination we can make is off the end-of-the-year test—what the students know in April.’ What we’re saying is, if a student knows it, and they can show us in January, shouldn’t that count?”
Photo: Some schools in Georgia, including Newnan Crossing Elementary School, are piloting a new education assessment program called Kindville, which looks, and plays, just like a video game, but will eventually produce qualitative math and reading scores.