Pearson’s Work on PARCC Common-Core Field Tests Scrutinized in Survey

A report that reviews schools' experiences with common-core field tests given earlier this year offers a mostly positive picture of those assessments, but it also shows that K-12 officials were dissatisfied with or ambivalent about the execution of a number of tasks overseen by Pearson, a major contractor for the exams.

The review of the field tests, released this week, is based on surveys of test administrators, test coordinators, and students in states that took exams given by the Partnership for Assessment of Readiness for College and Careers, as well as on direct observations of the exams and other measures.

Representatives of Pearson and PARCC both described the field tests to Education Week as having fulfilled their goals as practice assessments, one of which was to identify areas in need of improvement.

More than 1.1 million students in about 16,000 schools took the field tests this past spring, and PARCC states conducted the survey to gauge schools’ experiences with the exams. On balance, the trial run went “very smoothly,” the report concluded. (See my colleague Catherine Gewertz’s overview of the review’s findings posted on the Curriculum Matters blog.)

But the report also showed that school officials were disappointed with some aspects of the process, including procedures overseen by Pearson, and it says the company will be asked to improve in a number of areas. Among the report's findings:

  • Just 28 percent of district test coordinators surveyed agreed that TestNav8, described in the report as a cloud-based system developed by Pearson for delivering tests via a Web browser on various devices, worked well during the test administration. A larger portion of those testing officials, 40 percent, disagreed or strongly disagreed that TestNav8 performed well, while 15 percent were neutral on that point.
  • Roughly 35 percent to 45 percent of test coordinators said the process for establishing test sessions and registering students through PearsonAccess was straightforward. PearsonAccess, the report explains, is a consolidated online system for managing the PARCC assessments; depending on the user, it serves to manage student data for testing, manage staff accounts, or create computer-based testing sessions.
  • And only 42 percent of test coordinators said the information in the test-administration manuals was easy to find, and just 46 percent said it was comprehensive enough. The report says that PARCC states worked with Pearson to produce those manuals.

The report outlines several steps, based on the trial run, that will be taken to improve the testing process: revising manuals and training modules; upgrading PearsonAccess and TestNav8; and conducting an "independent verification and validation of TestNav8 performance."

A Pearson official told Education Week that the company, which is based in New York and London, would work with PARCC states to make adjustments to its procedures, based on findings from the field tests.

“This is an innovative and new way of testing, including new types of assessment items, and that is why states and schools participated in the field test,” Pearson spokesman Brandon Pinette said in a statement.

“[M]any lessons and best practices were learned. Pearson is proud of the role we played in this historic accomplishment and is working closely with PARCC to ensure that the testing experience is continuously improved.”

Overall, the PARCC report found that no “system-wide” technology breakdowns occurred during the field test. Most of the tech snafus instead occurred at the local level, the document explains, citing firewall and computer settings that needed to be changed, and students who needed help logging in. Some of those problems were to be expected in school districts that were giving computer-based exams for the first time, the PARCC report said.

A spokesman for PARCC, David Connerty-Marin, said the process for using the field test to identify shortcomings, most of which the states regarded as minor, had worked as planned. He pointed out that it was the first time some of the PARCC states had used computer-based exams on a large scale.

“The states, the testing company and schools participated in the field tests for just this purpose—to learn best practices and to see where we can make improvements before the 2014-15 administration,” Connerty-Marin said in an e-mail.

“The states were very pleased with the field test. The fact that we’re talking about how the test manuals need to be written more clearly and it should be easier to register students for the test—that is an indication that things went well and we were able to focus on important, but relatively minor, fixes.”
