Some analysis left undone in data-driven education department

The Department of Education crunches state test scores in dozens of ways to measure the performance of schools, principals, teachers, and students. But it does not perform a statistical analysis that can reveal whether an elementary school’s graduates have received test scores that far outstrip their actual skills.

Researchers say it would be relatively easy for the department to calculate “swing rates,” the proportion of students from each school whose scores rise or fall by a statistically unlikely margin when they move to another school. Such an analysis could take some of the burden off individual educators to report suspicions of cheating.
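
The computation itself is simple to describe. Here is a minimal sketch in Python of what such a check might look like, assuming a table of matched year-to-year standardized scores; the column names, sample data, and two-standard-deviation threshold are illustrative assumptions, not the department’s actual methodology:

    import pandas as pd

    # Hypothetical input: one row per student, with standardized (z) test
    # scores from two consecutive years and the student's sending school.
    scores = pd.DataFrame({
        "student_id":     [1, 2, 3, 4, 5, 6],
        "sending_school": ["PS_A", "PS_A", "PS_A", "PS_B", "PS_B", "PS_B"],
        "z_score_year1":  [3.1, 2.8, 2.9, 0.4, 0.1, -0.2],
        "z_score_year2":  [0.6, 0.3, 0.8, 0.5, -0.1, 0.0],
    })

    # Flag a "swing": a year-to-year change of more than two standard deviations.
    scores["swing"] = (scores["z_score_year1"] - scores["z_score_year2"]).abs() > 2

    # Swing rate per sending school: the share of its graduates whose scores
    # moved by a statistically unlikely margin after they changed schools.
    swing_rates = scores.groupby("sending_school")["swing"].mean()
    print(swing_rates)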

The city conducted swing rate analyses before the Bloomberg administration took office, according to a former testing official, and the state is poised to launch the measure as part of an overhaul of its own approach to test security.

But department officials say the analysis would be too onerous. They also say that they never launch investigations into cheating based on data anomalies alone. Instead, they say they will dispatch investigators only when they receive formal allegations of test improprieties.

The policy means that some top-rated schools whose students’ scores plummet at rates far higher than average never have their testing practices scrutinized.

For all of the criticism of state tests as arbitrary and imperfect measures of student performance, they are remarkably stable. In 2011, students who saw their scores fall by more than two standard deviations from the previous year made up just 0.6 percent of the sixth-grade test-taking population in English, and 0.4 percent in math. That degree of decline is highly improbable under normal circumstances and is more likely to reflect external factors than real changes in academic proficiency.

If one student’s test score plummets to that degree, it might be reasonable to conclude that he had a bad day when the second test was administered. But if an entire cohort of students sees its scores plummet, it could be that testing conditions were especially favorable in the first year, perhaps illicitly.

Some city schools have posted swing rates many times the 0.6 percent average, according to the New York Times. At the two elementary schools with the highest scores on the city’s 2011 progress reports, P.S. 257 and P.S. 31 in Brooklyn, the swing rates that year were 9 percent and 1.8 percent, respectively.

Neither school raised eyebrows at the department until a whistleblower at a middle school that received their students registered an official complaint about wide discrepancies between the students’ test scores and their actual skills. The allegation triggered investigations at both schools.

But the Brooklyn schools’ swing rates were not even the highest in the city that year. About 30 percent of the 34 students who graduated from P.S. 199 in the South Bronx and went on to a middle school up the road, I.S. 303, saw their test scores drop by more than two standard deviations. P.S. 199 had the 16th highest progress report score that year.
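
To put those numbers in perspective, a back-of-the-envelope check is possible: if each of those 34 students independently faced the citywide 0.6 percent chance of such a drop, the probability that roughly 30 percent of them would show one is vanishingly small. The independence assumption below is a simplification that real cohorts violate, so this is an illustration rather than a formal test:

    from scipy.stats import binom

    n, p = 34, 0.006      # cohort size; citywide rate of two-SD drops
    k = round(0.30 * n)   # about 30 percent of the cohort, i.e. 10 students

    # Probability of k or more large drops if each student independently
    # had the citywide 0.6 percent chance of one.
    prob = binom.sf(k - 1, n, p)
    print(f"P(at least {k} of {n} students): {prob:.1e}")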

At I.S. 232, a nearby middle school that absorbed 29 students from P.S. 199, the swing rate was more in line with the city average.

But teachers at I.S. 303 said that the high test scores did not match the basic English and math skills that many of the incoming P.S. 199 students demonstrated early in sixth grade.

“You don’t just forget everything,” said one math teacher. “It just baffled me that they somehow got 3’s and 4’s” in fifth grade.

Multiple teachers at the school said the students’ behavior during the sixth-grade state tests suggested they had received help on tests before.

Some of the students from P.S. 199 grew frustrated during the tests when I.S. 303 teachers did not tell them the answers to questions that stumped them, the teachers said. The teachers agreed to speak on the condition of anonymity because their principal had not given them permission to speak to reporters.

But even after teachers noticed the score discrepancy, no one at I.S. 303 went to the department that year. That is a common decision, according to educators from across the city, who say allegations usually bring scrutiny first to the people who filed them, sometimes exposing them as whistleblowers to their colleagues.

An investigation at P.S. 199 is now open, a department spokeswoman said. Education officials contacted investigators after GothamSchools asked repeatedly about the school’s anomalous scores.

The department’s Office of Special Investigations is now handling the case, which was referred by the city’s Office of the Special Commissioner of Investigation, an SCI spokeswoman said.

Department officials said their concerns were not based solely on the swing rate; data points like swing rates, they reiterated, are not enough on their own to trigger investigations.

Many of the 37 schools to which the department dispatched testing monitors last year had seen their scores increase in unusual ways. All but four of them, however, had also been the subject of formal allegations.

In fact, the department does not even calculate schools’ swing rates as part of its regular analysis of schools’ performance. When the accountability division runs the numbers that feed into schools’ annual progress reports, which are based largely on students’ year-to-year growth, it does not aggregate results by sending schools.

Department officials say generating schools’ swing rates is a complicated endeavor. After learning about the score discrepancies at P.S. 199, GothamSchools requested swing data for other top-rated schools. But department officials said for months that running those numbers would be too difficult.

“We have provided you with a detailed analysis of P.S. 199,” spokeswoman Marge Feinberg wrote in an email this summer. “For the other schools, it would take a great deal of time.”

Researchers who have worked with Department of Education data before disputed that claim.

“It would be quite easy to do. Just about anyone with a computer and a basic knowledge of statistics could run these checks,” said Sean Corcoran, a New York University professor who has studied the city’s test score data. “This is something the DOE should do on a regular basis.”

A former testing director in New York City said that under his watch the city used data anomalies to trigger investigations.

“We routinely would look at the change in scores in schools from one year to the next,” said Bob Tobias, the Board of Education’s longtime testing chief who retired in 2002. Schools that showed erratic spikes, Tobias said, “would get a little more scrutiny.”

There are signs that the state might start conducting this type of analysis on its own. One charge given to Tina Sciocchetti, the State Education Department’s brand-new test security chief, is to set guidelines for pursuing investigations using data methods that look for suspicious test score patterns.

“I think those sorts of statistical analyses are simply a red flag and it’s absolutely true that additional investigation is necessary,” said Sciocchetti.