Bumpy runway

Emails reveal months of missteps leading up to Tennessee’s disastrous online testing debut

PHOTO: Chalkbeat Photo Illustration

Tennessee education officials allowed students and teachers to go ahead with a new online testing system that had failed repeatedly in classrooms across the state, according to emails obtained by Chalkbeat.

After local districts spent millions of dollars on new computers, iPads, and upgraded internet service, teachers and students practiced for months taking the tests using MIST, an online testing system run by North Carolina-based test maker Measurement Inc.

They encountered myriad problems: Sometimes, the test questions took three minutes each to load, or wouldn’t load at all. At other times, the test wouldn’t work on iPads. And in some cases, the system even saved the wrong answers.

When students in McMinnville, a town southeast of Nashville, logged on to take their practice tests, they found some questions already filled in — incorrectly — and that they couldn’t change the answers. The unsettling implication: Even if students could take the exam, the scores would not reflect their skills.

“That is a HUGE issue to me,” Warren County High School assistant principal Penny Shockley wrote to Measurement Inc.

PHOTO: Grace Tatter
Tennessee Education Commissioner Candice McQueen speaks with reporters in February about technical problems with the state’s new online assessment.

The emails contain numerous alarming reports about practice tests gone awry. They also show that miscommunication between officials with the Tennessee Department of Education and Measurement Inc. made it difficult to fix problems in time for launch.

And they suggest that, even as problems continued to emerge with the test date approaching, state officials either failed to grasp how widespread the problems were or downplayed them in their communications with schools. As a result, district leaders who could have chosen to have students take the test on paper instead moved forward with the online system.

The messages span from October to Feb. 10, two days after the online test debuted and was canceled within hours. Together, they offer a peek into how Tennessee wound up with a worst-case scenario: countless hours wasted by teachers and students preparing for tests that could not be taken.

October: ‘Frustration … is definitely peaking’

Leaders with the Education Department, local districts and Measurement Inc. all knew that Tennessee’s transition to online tests wouldn’t be easy. So the test maker and the department developed a plan to identify weaknesses: stress tests they called “Break MIST” to tax and troubleshoot the online system.

They all had a lot riding on a smooth rollout. Tennessee was counting on the scores to assess whether students were measuring up to new, more challenging standards, to evaluate teachers, and to decide which schools to close. Districts, even the most cash-strapped, had invested millions of dollars in new technology. And Measurement Inc., a small company headquartered in Durham, was looking to prove that it belonged in the multibillion-dollar testing industry’s top tier.

The first “Break MIST” day on Oct. 1 was a mess — as expected. Students in the eastern part of the state logged on without issue, but the system stumbled as the majority of students started their tests an hour later.

That morning, emails show that Measurement Inc. received 105 calls reporting problems. The company noted particular problems in districts using iPads. Officials from the testing company assured the state that the bugs could be fixed, and the education department passed the message on to the public.

Department officials said nearly 1.5 million practice tests were completed successfully over the course of the fall. But emails show that even on days that weren’t meant to tax the system, problems emerged.

On Oct. 20, students in some districts were taking practice tests when “everything quit,” according to a state official who summarized complaints that local technology coordinators were swapping by email.

“Not very reassuring,” wrote Randy Damewood, the IT coordinator in Coffee County.

“Not good news,” agreed John Payne, director of technology for Kingsport City Schools, who suggested that his own district’s tests were working that day.

“The frustration among teachers and central office staff is definitely peaking,” wrote Eric Brown, a state official.

But there was more frustration to come, much of it behind the scenes at the Education Department.

December to January: Communication falters

Even after Measurement Inc. and department officials worked together to address problems during practice tests, the department still wasn’t confident in the online system. Officials weren’t sure whether the problems stemmed from local infrastructure or something bigger, so they planned two more “Break MIST” days in January to find out.

But they didn’t involve Measurement Inc. in the planning, at least according to company officials who wrote to the department to say they learned of those plans only after being copied on an email sent to local superintendents by Education Commissioner Candice McQueen.

That message was one of many in which officials with the state or the testing company expressed frustration about communication in the weeks leading up to the testing period.

One tense exchange dealt with the problems faced by students taking practice tests on iPads. “Will the iPad platform be ready for primetime in the spring?” Assistant Commissioner Nakia Towns asked Measurement Inc. officials on Dec. 3. “I feel like we need to be honest on this one.”

The test maker did not email a response. Towns raised the issue again a month later, indicating she was still waiting. “I had asked the question very directly in December,” she wrote Measurement Inc. on Jan. 6. “We urgently need an update.”

It took five more days, until Jan. 11, for her to get an answer. A reply from a Measurement Inc. testing expert blamed the problem on Apple but suggested the company had a “workaround.”

The next day, 504 students in Dyer County, about 80 miles north of Memphis, attempted to take the exam, many of them using iPads. Not one was able to complete the test because questions took too long to load, according to a report from Measurement Inc.’s call center. (Another half-million tests were completed successfully during January, according to department officials.)

Henry Scherich

In an interview this week, McQueen told Chalkbeat that Measurement Inc. never fixed the iPad problem and that state officials called Apple themselves looking for a solution. She was still looking for an answer on Jan. 21, when she tried to speak directly with Measurement Inc. President Henry Scherich.

“She is wondering if there is any way for you to find even 15 minutes today for a call,” McQueen’s chief of staff wrote. “Commissioner will make herself available. We need to speak to someone who would be able to make a decision concerning technology in an effort to get communication to directors of schools today.”

Scherich, who was in Michigan meeting with that state’s education department, initially said he did not have time to speak with McQueen. (Measurement Inc. is one of two companies producing Michigan’s new exam.) Later that day, he agreed to speak.


McQueen said she and her team came to a conclusion the next day: The test wouldn’t work on iPads. They emailed and called districts that had purchased tablets for testing and recommended a switch to paper.

February: A last-minute warning gets too little attention

Even as tensions mounted and glitches piled up, both the department and Measurement Inc. projected confidence about what would happen on Feb. 8, when the test would go live for most Tennessee schools. State officials even invited reporters to Department of Education offices on Feb. 3 to say they were optimistic about the rollout.

But behind the scenes, they were preparing for the worst. McQueen asked the test maker’s call centers to prepare for a major outage, something a Measurement Inc. employee told her was “very unlikely.”

She also emailed districts telling them they should consider switching to paper tests if their students were waiting too long for questions to load. She gave them three days to decide.

Just 15 of Tennessee’s nearly 150 districts took her up on the offer, McQueen told Chalkbeat.

But emails show that the state knew that most districts were having difficulties. When one district’s technology coordinator asked the state for a list of districts ready for the online exam, officials came up short.

“I don’t think I can answer that with any confidence,” the department’s top technology officer wrote.

Five days later, on Monday, Feb. 8, the test officially began. Again, the system handled the first set of test takers but broke down when the rest of the state’s students logged on.

As students stopped being able to connect or saw their tests freeze, emails show that technology directors began frantically contacting each other.

“Has anyone else had MIST drop out on them?” the director from Houston County Schools asked. A chorus of technology directors from other districts replied in the affirmative.

Within hours, Tennessee had ended its foray into online testing. First, McQueen told districts to suspend the exams, then directed them to give up on the online platform altogether.

“We are not confident in the system’s ability to perform consistently,” she wrote in an email to school superintendents that afternoon.

McQueen told Chalkbeat that officials started the day “in good faith,” with an assumption that Measurement Inc. had resolved problems adequately. Scherich told Chalkbeat that he’s still unconvinced that the problems were the company’s fault. He suggested that Tennessee’s decision to cancel testing came too soon.

Either way, the department’s top technology official put it simply in an email to McQueen on the day of the failure. “It appears that greater procedural and operational rigor could have prevented the network outage,” Cliff Lloyd wrote.

The debacle was just what Ravi Gupta, the CEO of a Nashville-based charter school, was worried about when he pressed the state in January for more transparency about the status of the online platform.

“It would be a betrayal of our students’ hard work if adult technical failures stood in the way of their success,” Gupta wrote to McQueen.

In the end, that’s exactly what happened.

Clarification (June 28, 2016): This story has been revised to clarify the impact of the department’s communications on district testing decisions. It has also been updated to include new information about successful practice tests.

School choice

Secret CPS report spotlights big vacancies, lopsided options for students

The school district says the report will help inform how it invests in and engages with communities. Community groups worry the document will be used to justify more school closings, turnarounds, and charters.

An unreleased report by a school choice group backed by the business community paints in stark detail what many Chicagoans have known for years: that top academic schools are clustered in wealthier neighborhoods, and that fewer black and Latino students have access to those schools.

The report highlights startling figures: About 27 percent of black students are in the district’s lowest-rated schools, compared with 8 percent of Latino students and 3 percent of white students. It also says that Chicago Public Schools has more than 150,000 unfilled seats, and that 40 percent of them, or 60,000, are at top-ranked schools. That surplus will grow as enrollment, which has been plummeting for years, is projected to fall another 5.1 percent over the next three years. In effect, the cash-strapped district is moving toward having nearly one extra seat for every two of its students.

The document effectively shows that, in many areas of the city, students are skipping out on nearby options, with less than half of district students attending their designated neighborhood schools.

In a city still reeling from the largest mass school closure in U.S. history, this report could lay groundwork for another round of difficult decisions.

The “Annual Regional Analysis” report, compiled by the group Kids First Chicago on CPS’ behalf, has been circulating among select community groups but has not been made public. It comes on the heels of a report showing students’ high school preferences vary with family income level. Students from low-income neighborhoods submit more applications than students from wealthier ones and apply in greater numbers for the district’s charter high schools.

The group behind the latest report has had many iterations: Kids First is a new name, but its origins date back to 2004, when it started as the charter fundraising group Renaissance Schools Fund. That was during the Renaissance 2010 effort, which seeded 100 new schools across the city, including many charters. The group changed its name to New Schools Chicago in 2011 and again rebranded this year as Kids First, with a greater focus on parent engagement and policy advocacy.

The report has caused a stir among some community groups who’ve seen it. Because the school district has used enrollment figures to justify closing schools, some people are worried it could be used to propose more closings, turnarounds, and charter schools.

“To me this is the new reason [for school closings],” said Carolina Gaete, co-director of community group Blocks Together, which supports neighborhood schools. “Before it was academics, then it was utilization, now it’s going to be access and equity. Numbers can be used any way.”

In a statement on the report, Chicago Teachers Union Spokeswoman Christine Geovanis blasted Mayor Rahm Emanuel’s administration for policies that she alleged “undermine enrollment at neighborhood schools,” such as the proliferation of charter schools, school budget cuts, and building new schools over the objection of community members.

Reached by phone Thursday, Kids First CEO Daniel Anello confirmed that his organization helped put the report together, but declined to comment on its contents, deferring to the district. CPS Spokeswoman Emily Bolton acknowledged the report’s existence in a statement emailed to Chalkbeat Chicago that said the school district “is having conversations with communities to get input and inform decisions” about where to place particular academic programs. The statement said CPS is still in the process of drafting a final version of the document, but gave no timetable. Mayor Rahm Emanuel’s office didn’t grant requests for interviews about the Annual Regional Analysis.

Below is a preview of the report provided to Chalkbeat Chicago.

Gaps in access to arts and IB programs

Data released this week from the district’s GoCPS universal high school application clearly shows what academic programs are most in demand: selective enrollment programs that require children to test in; arts programs; and career and technical education offerings, or CTE.

Kids First’s analysis puts those findings into context by detailing how supply is geographically uneven, especially when it comes to the arts. Maps in the report divide the city into regions defined by the city’s planning department and show that highly desirable arts programs are not spread evenly across the city; they are most concentrated along the northern lakefront and downtown.

PHOTO: Sam Park
This map shows the number of fine & performing arts program seats available per 100 elementary school students in each planning area.

Worse, four regions offer 10 or fewer arts seats per 100 students, including the Bronzeville/South Lakefront region, which takes in neighborhoods such as South Shore, Woodlawn, Kenwood, and Hyde Park. Arts seats are also scarce in the West Side region, which includes Austin, North Lawndale, and Humboldt Park, and in the Northwest neighborhoods of Belmont Cragin, Dunning, and Portage Park.

The report also shows an imbalance in the number of rigorous International Baccalaureate programs.

This map shows the number of IB program seats per 100 students available to elementary and high school students in each planning area.

The highest number of IB seats is in the wealthy, predominantly white Lincoln Park area. In contrast, there are far fewer IB seats in predominantly black communities such as Englewood, Auburn Gresham, and Ashburn, and in the predominantly Latino Back of the Yards.

When it comes to selective-enrollment elementary school programs such as gifted centers and classical schools, which require students to pass entrance exams, options tend to be concentrated, too, with fewer choices on the South and West sides of the city. This map shows where selective enrollment high school options are most prevalent:

PHOTO: Sam Park
This map shows the number of selective enrollment high school seats available per 100 students in the city’s planning regions.

STEM programs are more evenly distributed across Chicago than both IB and selective enrollment schools, yet whole swaths of the city lack them, especially on the South Side, including the Greater Stony Island region. As the other maps show, that region lacks most of the high-demand academic programs the district has to offer.

PHOTO: Sam Park
This map shows the number of STEM program seats available per 100 elementary school students.

Racial disparities in school quality

The analysis also shows disparities in quality of schools, not just variety.

At CPS, 65 percent of students districtwide are enrolled at Level 1-plus or Level 1-rated schools. But only 45 percent of black students and 72 percent of Latino students are in those top-rated seats, compared with 91 percent of white students.

The disparities are even more severe given that the school district is mostly Latino and black, with fewer than one in 10 students identified as white. 

A page from a presentation of the Annual Regional Analysis shown to select community groups.

In the Greater Lincoln Park region, 100 percent of elementary schools have one of the top two ratings — the highest concentration of them in the city. The highest concentration of top-rated high school seats, 91 percent, is in the Central Area, which includes downtown and the South Loop.

The lowest concentration of top-rated elementary seats, 35 percent, is in the Near West Side region, and the lowest concentration of high school seats, 14 percent, is in the West Side region.

Long commutes from some neighborhoods

The number of students choosing schools outside their neighborhood boundaries has increased in recent years.

But the report shows that school choice varies by race: 44 percent of black students attend their neighborhood elementary school, compared with 67 percent of Latino students, 69 percent of white students, and 66 percent of Asian students. For high schoolers, only 14 percent of black students attend their neighborhood school, compared with 28 percent of Asians, 30 percent of Latinos, and 32 percent of whites.

As more students enroll outside their neighborhood attendance boundaries, more of them face longer commutes, and how far they travel depends on where they live.

Again, the Greater Stony Island region stands out here.

A graphic from the Annual Regional Analysis executive report that shows how far elementary school students in each of the city’s 16 planning regions travel from their homes to school. The data shows that students on the South and West Sides tend to have longer commutes.

The average distance traveled for elementary school students is 1.5 miles — but K-8 students in Greater Stony Island travel an average of 2.6 miles. The average distance to class for high schoolers citywide is 2.6 miles, but students in the Greater Stony Island region travel an average of 5 miles, about twice the city average. 

A graphic from the Annual Regional Analysis executive report that shows how far high school students in each of the city’s 16 planning regions travel from their homes to school. The data shows that students on the South and West Sides tend to have longer commutes.

Looking forward

The introduction to the Annual Regional Analysis describes it as “a common fact base” to understand the school landscape. It clearly states the intent of the report is to assist with district planning, not to provide recommendations.

The report still bothers Wendy Katten, founder of Raise Your Hand, who has seen it.

“It sounds like some data a company would use to reduce inventory at a manufacturing plant,” she said.

Gaete, of Blocks Together, said the numbers in the report are missing important context about how the proliferation of charter schools, a lack of transparent and equitable planning, and a lack of support for neighborhood schools in recent decades have exacerbated school quality disparities across race and neighborhood in Chicago, one of the nation’s most diverse but segregated cities.

It’s unclear when the final study will be published, or how exactly the school district will use its contents to inform its decisions and conversations with communities.

But an event posting on the website for Forefront, a membership association for “nonprofits, grantmakers, public agencies, advisors, and our allies,” mentions a briefing for the report on Oct. 10.

Kids First Chicago CEO Dan Anello and CPS Director of Strategy Sadie Stockdale Jefferson will share the report there, according to the website.

State test results

With accelerated growth in literacy and math, Denver students close in on state averages

Angel Trigueros-Martinez pokes his head from the back of the line as students wait to enter the building on the first day of school at McGlone Academy on Wednesday. (Photo by AAron Ontiveroz/The Denver Post)

Denver elementary and middle school students continued a recent streak of high academic growth this year on state literacy and math tests, results released Thursday show. That growth inched the district’s scores even closer to statewide averages, turning what was once a wide chasm into a narrow gap of 2 percentage points in math and 3 in literacy.

Still, fewer than half of Denver students in grades three through eight met state expectations in literacy, and only about a third met them in math.

Find your school’s test scores
Look up your elementary or middle school’s test scores in Chalkbeat’s database here. Look up your high school’s test results here.

Denver’s high schoolers lagged in academic growth, especially ninth-graders who took the PSAT for the first time. Their test scores were lower than statewide averages.

“We are absolutely concerned about that,” Superintendent Tom Boasberg said Thursday of the ninth-grade scores, “and that is data we need to dig in on and understand.”

Students across Colorado took standardized literacy and math tests this past spring. Third- through eighth-graders took the Colorado Measures of Academic Success, or CMAS, tests, which are also known as the PARCC tests. High school students took college entrance exams: Ninth- and 10th-graders took the PSAT, a preparatory test, and 11th-graders took the SAT.

On CMAS, 42 percent of Denver students in grades three through eight met or exceeded state expectations in literacy. Statewide, 45 percent of students did. In math, 32 percent of Denver students met expectations, compared with 34 percent statewide.

While Denver’s overall performance improved in both subjects, third-grade literacy scores were flat. That’s noteworthy because the district has invested heavily in early literacy training for teachers and has seen progress on tests taken by students in kindergarten through third grade. That wasn’t reflected on the third-grade CMAS test, though Boasberg said he’s hopeful it will be as more students meant to benefit from the training take that test.

On the PSAT tests, Denver ninth-graders earned a mean score of 860, which was below the statewide mean score of 902. The mean PSAT score for Denver 10th-graders was 912, compared with the statewide mean score of 944. And on the SAT, Denver 11th-graders had a mean score of 975. Statewide, the mean score for 11th-graders was 1014.

White students in Denver continued to score higher, and make more academic progress year to year, than black and Hispanic students. The same was true for students from high- and middle-income families compared with students from low-income families.

For example, 69 percent of Denver students from high- and middle-income families met expectations on the CMAS literacy tests, compared with just 27 percent of students from low-income families – which equates to a 42 percentage-point gap. That especially matters in Denver because two-thirds of the district’s 92,600 students are from low-income families.

Boasberg acknowledged those gaps, and said it is the district’s core mission to close them. But he also pointed out that Denver’s students of color and those from low-income families show more academic growth than their peers statewide. That means they’re making faster progress and are more likely to reach or surpass grade level in reading, writing, and math.

Denver Public Schools pays a lot of attention to annual academic growth, as measured by a state calculation known as a “median growth percentile.”

The calculation assigns students a score from 1 to 99 that reflects how much they improved compared with other students with similar score histories. A score of 99 means a student did better on the test than 99 percent of the students who had similar scores the year before.

Students who score above 50 are considered to have made more than a year’s worth of academic progress in a year’s time, whereas students who score below 50 are considered to have made less than a year’s worth of progress.
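To make the mechanics concrete, here is a minimal sketch in Python of how a score like this could be computed for a handful of students. The bin width, the function name, and the sample scores are all assumptions made for illustration; Colorado’s actual model is a more sophisticated statistical calculation than this simple binning and ranking.

```python
from collections import defaultdict

def growth_percentiles(students, bin_width=50):
    """Assign each student a 1-99 growth score relative to peers
    with similar prior-year scores. Purely illustrative: the real
    state calculation uses a more sophisticated statistical model."""
    # Group students into bins of peers with similar prior scores.
    bins = defaultdict(list)
    for s in students:
        bins[s["prior_score"] // bin_width].append(s)

    results = {}
    for peers in bins.values():
        # Rank each student's current score within the peer group.
        ordered = sorted(peers, key=lambda s: s["current_score"])
        n = len(ordered)
        for rank, s in enumerate(ordered):
            pct = round(100 * rank / max(n - 1, 1))
            results[s["name"]] = min(max(pct, 1), 99)  # clamp to 1-99
    return results

# Hypothetical scores for four students with similar prior-year results.
students = [
    {"name": "A", "prior_score": 710, "current_score": 745},
    {"name": "B", "prior_score": 715, "current_score": 720},
    {"name": "C", "prior_score": 705, "current_score": 760},
    {"name": "D", "prior_score": 712, "current_score": 731},
]
print(growth_percentiles(students))  # e.g. {'B': 1, 'D': 33, 'A': 67, 'C': 99}
```

In this toy example, the student who improved most relative to similarly scoring peers lands near 99 and the one who improved least lands near 1, mirroring how the state’s 1-to-99 scale is read.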

The state also calculates overall growth scores for districts and schools. Denver Public Schools earned a growth score of 55 on the CMAS literacy tests and 54 on the CMAS math tests. Combined, those scores were the highest among Colorado’s 12 largest districts.

Other bright spots in the district’s data: Denver’s students learning English as a second language – who make up more than a third of the population – continued to outpace statewide averages in achievement. For example, 29 percent of Denver’s English language learners met expectations in literacy, while only 22 percent statewide did, according to the district.

Denver eighth-graders also surpassed statewide averages in literacy for the first time this year: 45 percent met or exceeded expectations, as opposed to 44 percent statewide. That increase is reflected in the high growth scores for Denver eighth-graders: 52 in math and 57 in literacy.

Those contrast sharply with the ninth-grade growth scores: 47 in math and an especially low 37 in literacy. That same group of students had higher growth scores last year, Boasberg said; why their progress dropped so precipitously is part of what district officials hope to figure out.