behind the scenes

What a day is like inside Pearson’s test scoring facility

Facing widespread backlash after years of controversies and testing glitches, one of the world’s largest testing companies is taking an unusual approach to quieting critics: It’s opening its doors.

Pearson, a British-based testing conglomerate that recently signed on for a two-year contract to aid in writing and administering Indiana’s ISTEP test, today invited Indianapolis reporters to its north side scoring facility in an effort to reveal how the company hand-scores questions on hundreds of thousands of student exams.

It’s part of a charm offensive from a company that depends on public contracts but is often at the center of public debate over testing in schools.

“We don’t see any reason why we shouldn’t pull back the curtain,” said Scott Overland, Pearson’s new director of media.

With parents and educators often skeptical about how exams are made and scored, and with test scores influencing everything from teacher bonuses to whether schools are allowed to stay open, Pearson is hoping to allay fears by better explaining its work.

“We completely understand how this whole assessment process, the knowledge that parents and educators and the public has about that is limited,” Overland said as he led reporters through Pearson’s scoring facility on the third floor of a mid-sized office building.

The tour featured a short walk through the floor, which consisted of three large rooms and several small offices and conference rooms. Pearson executives and officials in charge of the scoring process explained how scorers, who must have a four-year college degree, are recruited by reaching out to retired teachers, tutors and other educators. They receive about seven hours of in-person or online training, and most learn to score just one type of open-ended question — such as essays or math problems that require students to show their work to demonstrate that they understand the concept.

Multiple choice questions are graded by machine at Pearson’s main scoring facility in Iowa.

Pearson execs on the tour showed reporters how scorers log onto a computer platform where they see scanned images of the student papers they must assess and grade. Their accuracy is checked with “validity” questions, items supervisors use to verify that scorers are grading consistently.

Scoring supervisors sit quietly at rows of tables in front of boxy computers in the scoring center. They’re in regular communication with the scorers themselves, who typically work from home across Indiana. Because scorers and supervisors don’t necessarily work regular business hours, many tables were sparsely filled Thursday morning.

Allison Tucker, a scoring supervisor for fourth-grade reading who’s been working with Pearson for more than 10 years, said one of her graders might do 70 questions in an hour. If a scorer gets off track and starts grading incorrectly, Tucker said that’s where the supervisors can step in.

“That’s one of the things that we really take seriously,” Tucker said. “So far it hasn’t been a problem for us.”

Few businesses in the education world are under as much day-to-day scrutiny as testing giants like Pearson, since just a handful of companies compete for lucrative state testing contracts and the chance to sell associated test prep materials for those exams.

Pearson is the largest education company in the world and a leader in standardized test development, having nabbed a contract for the multistate, Common Core-linked PARCC exam and one to handle the scoring for the National Assessment of Educational Progress (NAEP).

Yet it’s an industry frequently under fire when errors are discovered among millions of test questions or when problems arise with scoring or computer testing platforms. Every few weeks during standardized testing season, critics can seize on headlines reporting computer malfunctions or other testing disruptions.

Just yesterday, an employee error caused widespread cancellations of New Jersey’s PARCC exam.

The problems aren’t limited to Pearson. Indiana’s 2015 ISTEP test, which was haunted by glitches and scoring delays, was administered by California-based CTB, a Pearson competitor. CTB also ran into problems in 2013, when about 78,000 Indiana students taking the test on computers were interrupted over the course of several days — an error that forced CTB to pay $13 million in damages to the state.

Indiana then dumped CTB and hired Pearson last year with a more than $30 million contract to administer the 2016 and 2017 ISTEP exams, but the state is now looking to create yet another new exam for 2018.

The new exam will surely generate another sought-after testing contract. So Pearson could be treating the ISTEP as something of an audition, trying to make a good impression in hopes of ongoing work.

“We recognize very much that this is critically important work we are doing,” said Melodie Jurgens, who oversees the general scoring process. “Our scorers are quite passionate, and they care a lot about how students do. They want to get it right because they know it’s important.”

Indiana is one of the first states where Pearson has invited reporters to tour its facilities, though Overland said some national news outlets were given tours of the Iowa facility earlier this week. The company hasn’t used such strategies in the past, he said, but plans to open up tours in other states going forward.

Granting this level of access to reporters isn’t a common move for testing companies, said Bob Schaeffer, spokesman for The National Center for Fair and Open Testing, an organization that acts as a testing watchdog. He said he’d been contacted by another reporter about a similar tour this past week but had never heard of this approach before.

But given the challenges Pearson has faced recently — including the loss of three major testing contracts in Florida, Texas and New York — it’s not necessarily a surprise.

“All the major testing companies have had computer testing failures,” Schaeffer said. “It shows an incredible pattern of technological failure that is more than the isolated glitch that they like to make it seem.”

Since Indiana switched to Pearson this year, things have gone relatively smoothly. The state officially started its second round of 2016 ISTEP tests this week, and few problems have been reported.

But Schaeffer said Indiana has “jumped from the frying pan into the incinerator” by making its test vendor switch.

“It’s a perverse game of musical chairs in which a state might reject a contract with a vendor for doing a bad job and hires a new vendor who has time available because they just got fired from another job,” Schaeffer said.

TNReady snag

Tennessee’s ill-timed score delivery undercuts work to rebuild trust in tests

PHOTO: Laura Faith Kebede
The Tennessee Department of Education worked with local districts and schools to prepare students for TNReady, the state's standardized test that debuted in 2016.

After last year’s online testing failure, Education Commissioner Candice McQueen pledged to rebuild trust in Tennessee’s new TNReady assessment, a linchpin of the state’s system of school accountability.

A year later, frustration over TNReady has re-emerged, even after a mostly uneventful spring testing period that McQueen declared a success just weeks ago.

Preliminary TNReady scores are supposed to count for 10 percent of students’ final grades. But as many districts end the school year this week, the state’s data is arriving too late. One by one, school systems have opted to exclude the scores, while some plan to issue their report cards late.

The flurry of end-of-school adjustments has left local administrators to explain the changes to parents, educators and students who are already wary of state testing. And the issue has put Tennessee education officials back on the defensive as the state works to regain its footing on testing after last year’s high-profile setbacks.

“We just need to get more crisp as a state,” said Superintendent Dorsey Hopson after Shelby County Schools joined the growing list of districts opting to leave out the scores. “If we know that we want to use (TNReady scores), if the state says use them on the report card, then we got to get them back.”

The confusion represents one step back for TNReady, even after the state took two steps forward this spring with a mostly smooth second year of testing under Questar, its new test maker. Last year, McQueen canceled testing for grades 3-8 and fired Measurement Inc. after Tennessee’s online platform failed and a string of logistical problems ensued.

But the reason this year’s testing went more smoothly may also be the reason why the scores haven’t arrived early enough for many districts.

TNReady was mostly administered on paper this time around, which meant materials had to be processed, shipped and scored before the early data could be shared with districts. About 600,000 students took the assessment statewide.

After testing ended on May 5, districts had five days to get their materials to Questar to go to the front of the line for return of preliminary scores. Not all districts succeeded, and some had problems with shipping. Through it all, the State Department of Education has maintained that its timelines are “on track.”

McQueen said Wednesday that districts have authority under a 2015 state law to exclude the scores from students’ final grades if the data doesn’t arrive a week before school lets out. And with 146 districts setting their own calendars, “the flexibility provided under this law is very important.”

Next year will be better, she says, as Tennessee moves more students to online testing, beginning with high school students.

PHOTO: TN.gov
Candice McQueen

“We lose seven to 10 days for potential scoring time just due to shipping and delivery,” she said of paper tests. “Online, those challenges are eliminated because the materials can be uploaded immediately and transferred much, much quicker.”

The commissioner emphasized that the data that matters most is not the preliminary data but the final score reports, which are scheduled for release in July for high schools and the fall for grades 3-8. Those scores are factored into teachers’ evaluations and are also used to measure the effectiveness of schools and districts. 

“Not until you get the score report will you have the full context of a student’s performance level and strengths and weaknesses in relation to the standards,” she said.

The early data matters to districts, though, since Tennessee has tied the scores to student grades since 2011.

“Historically, we know that students don’t try as hard when the tests don’t count,” said Jennifer Johnson, a spokeswoman for Wilson County Schools, a district outside of Nashville that opted to issue report cards late. “We’re trying to get our students into the mindset that tests do matter, that this means business.”

Regardless, this year’s handling of early scores has left many parents and educators confused, some even exasperated.

“There’s so much time and stress on students, and here again it’s not ready,” said Tikeila Rucker, a Memphis teacher who is president of the United Education Association of Shelby County.

“The expectation is that we would have the scores back,” Hopson agreed.

But Hopson, who heads Tennessee’s largest district in Memphis, also is taking the long view.

“It’s a new test and a new process and I’m sure the state is trying to figure it all out,” he said. “Obviously the process was better this year than last year.”

Laura Faith Kebede and Caroline Bauman contributed to this report.

Not Ready

Memphis students won’t see TNReady scores reflected in their final report cards

PHOTO: Creative Commons / timlewisnm

Shelby County Schools has joined the growing list of Tennessee districts that won’t factor preliminary state test scores into students’ final grades this year.

The state’s largest school district didn’t receive raw score data in time, a district spokeswoman said Tuesday.

The State Department of Education began sharing the preliminary scores this week, too late in the school year for many districts letting out in the same week. That includes Shelby County Schools, which dismisses students on Friday.

While a state spokeswoman said the timelines are “on track,” Superintendent Dorsey Hopson said the timing was unfortunate.

“There’s a lot of discussion about too many tests, and I think anytime you have a situation where you advertise the tests are going to be used for one thing and then we don’t get the data back, it becomes frustrating for students and families. But that’s not in our control,” he said Tuesday night.

Hopson added that the preliminary scores will still get used eventually, but just not in students’ final grades. “As we get the data and as we think about our strategy, we’ll just make adjustments and try to use them appropriately,” he said.

The decision means that all four of Tennessee’s urban districts in Memphis, Nashville, Knoxville and Chattanooga won’t include TNReady in all of their students’ final grades. Other school systems, such as in Williamson and Wilson counties, plan to make allowances by issuing report cards late, and Knox County will do the same for its high school students.

Under a 2015 state law, districts can leave out standardized test scores if the information doesn’t arrive five instructional days before the end of the school year. This year, TNReady is supposed to count for 10 percent of final grades.

Also known as “quick scores,” the data is different from the final test scores that will be part of teachers’ evaluation scores. The state expects to release final scores for high schoolers in July and for grades 3-8 in the fall.

The Department of Education has been working with testing company Questar to gather and score TNReady since the state’s testing window ended on May 5. About 600,000 students took the assessment statewide in grades 3-11.

State officials could not provide a district-by-district listing of when districts will receive their scores.

“Scores will continue to come out on a rolling basis, with new data released every day, and districts will receive scores based on their timely return of testing materials and their completion of the data entry process,” spokeswoman Sara Gast told Chalkbeat on Monday. “Based on district feedback, we have prioritized returning end-of-course data to districts first.”

Caroline Bauman and Laura Faith Kebede contributed to this report.