Data is objective. Isn’t it?

In front of a packed auditorium at Madison Middle Prep last week, Metropolitan Nashville Public Schools board member Jill Speering presented data to prove a point: the middle school should not be taken over by the Achievement School District.

Her voice trembling with emotion, she directed the numbers at Chris Barbic, the superintendent of the statewide school district. She started her speech by comparing Brick Church College Prep, a current ASD school, to Metro Schools overall.

She said that Brick Church wasn’t as extraordinary as Barbic asserted, according to state growth data from the Tennessee Value-Added Assessment System, or TVAAS.

“I don’t know what you’re talking about,” Barbic replied. “They were level five TVAAS [the highest score for growth].”

He then chastised her, causing the crowd to boo: “You’re a school board member. You should know how TVAAS works.”

In the wake of the ASD’s announcement that it will take over one of two middle schools in a Nashville community, district officials and their opponents have continually cited data points that seem contradictory. Each side has accused the other of playing fast and loose with the numbers, and even of lying.

But in fact, representatives on both sides of the school takeover debate were using different yardsticks to measure the same things: the growth of student test scores at the Nashville middle schools up for takeover; at the schools run by LEAD Public Schools (the charter operator that will manage the taken-over school); and across the ASD as a whole.

The ASD was created to transform Tennessee’s bottom 5 percent of schools into the top 25 percent by 2020. For the first time, both Madison Middle Prep and Neely’s Bend Middle Prep in Nashville were in the bottom 5 percent of schools this year, making them eligible for state intervention.

Opponents of intervention, led by school board members Speering and Amy Frogge, say that in fact, Madison’s and Neely’s Bend’s students are improving at rates comparable to those at LEAD’s schools and other schools in the ASD. Therefore, they say, takeover isn’t necessary, and might even be counterproductive.

ASD officials say that LEAD’s schools are improving at much higher rates than Madison or Neely’s Bend.

So what explains this glaring discrepancy?

To make decisions about a school, including whether it is eligible for ASD takeover, the Tennessee Department of Education takes into account schoolwide passing rates on the state’s standardized test as well as TVAAS ratings.

TVAAS is a way to measure whether students at a school are growing more or less than their peers across the state. (You can read more about the controversy around TVAAS here.) To compute it, the state Department of Education first calculates a number that shows the growth of students at a school compared with students statewide.

Let’s call that the raw growth score. It’s the first of two steps the state uses in calculating TVAAS ratings.

But those raw growth scores were the numbers that Frogge and Speering cited. According to composite raw growth scores, which take into account test scores in math, reading, science and social studies, Neely’s Bend and Madison had slightly negative two-year growth scores of -1 and -2.5, respectively. The composite growth score for Brick Church College Prep, a LEAD school in the ASD that ASD officials tout as a model for their next Nashville takeover, was slightly positive at 0.4.

Those raw numbers don’t suggest that Brick Church is improving its students’ scores much more than the traditional district schools are. Break out raw scores for individual subjects, and Brick Church sometimes fares worse than the schools up for takeover. In reading, for example, its two-year raw growth score was -3.7, while Madison and Neely’s Bend had raw growth scores of -2.8 and -1.5, respectively.

But to get a final TVAAS rating for a school, there’s another step: the state divides the raw growth score by a standard error. Paul Changas, the director of research for Metro Schools, says the standard error takes a variety of factors into account, but that the smaller a school’s student population, the larger the standard error.

“Random factors such as student focus, distractability, energy level, emotional state, or text complexity can be fairly significant when only a few students are involved,” he said. “But as the size of the group gets larger, these random factors tend to balance out – for every student having a bad day taking the typing test there is more likely to be a student having a good day.”
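Changas’s point — that random factors balance out as groups get larger — reflects a basic statistical fact: the standard error of a group average shrinks roughly with the square root of the group size. A minimal sketch (the spread value here is invented for illustration and is not an actual TVAAS parameter):

```python
import math

# Hypothetical spread of individual students' test-day noise.
# This value is made up for illustration; it is not a TVAAS figure.
student_sd = 10.0

def standard_error_of_mean(sd, n):
    """Standard error of a group average: sd / sqrt(n)."""
    return sd / math.sqrt(n)

for n in (25, 100, 400):
    se = standard_error_of_mean(student_sd, n)
    print(f"{n:>4} students -> standard error of the average ~ {se:.1f}")
```

Quadrupling the number of students halves the standard error, which is why a small phase-in school’s average is treated as a noisier estimate than a large school’s.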

Source: TN.gov

Standard error also takes into account the variability in scores at a school, said Ashley Ball, a spokesperson for the Department of Education. Decision-makers on the state level want the data to tell them a clear story about what’s going on in a school, she said.

“If most of the students are exceeding expectations it tells us, wow, this is a really clear story,” she said. “If students are all across the board, you’re going to have a bigger standard error to account for the fact that the story isn’t as clear.”

The number a school gets after its raw score is divided by the standard error is called a “growth index.” From that, the state assigns the school a rating from Level 1 to Level 5, with five being the best.
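The two-step arithmetic described above can be sketched as follows. The standard-error values below are invented for illustration (the state does not publish them in this form), and the level cutoffs should be read as the commonly cited TVAAS bands rather than a definitive specification:

```python
def growth_index(raw_growth, standard_error):
    """Step two of the TVAAS calculation: raw growth score
    divided by standard error."""
    return raw_growth / standard_error

def tvaas_level(index):
    """Map a growth index to a Level 1-5 rating using the
    commonly cited TVAAS bands (treat as illustrative)."""
    if index >= 2:
        return 5
    if index >= 1:
        return 4
    if index >= -1:
        return 3
    if index >= -2:
        return 2
    return 1

# Same raw growth score, two hypothetical standard errors:
# a large school (small standard error) vs. a small phase-in
# school (large standard error). Both error values are invented.
large_school_index = growth_index(-2.5, 1.0)   # index -2.5
small_school_index = growth_index(-2.5, 3.0)   # index ~ -0.83
print(tvaas_level(large_school_index))  # Level 1
print(tvaas_level(small_school_index))  # Level 3
```

This is why the same raw growth score can land at different levels: dividing by a larger standard error pulls a small school’s index toward the middle bands, while a large school’s raw score passes through nearly intact.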

Source: Paul Changas, TN.gov

Since Brick Church is a phase-in school that has expanded one grade at a time, its student body is still considerably smaller than other middle schools’. The school therefore has a larger standard error, and its growth index looks quite different from its raw score.

The ASD has built its case for takeover on a variety of factors, among them the growth index, and Brick Church’s is higher than Neely’s Bend’s or Madison’s. Brick Church’s most recent overall growth index earns it a Level 5, while Madison and Neely’s Bend are Level 1s, putting them on the hot seat for takeover.

So which measurement of a school’s success is more valid? Opinions differ. Metropolitan Nashville Public Schools officials typically go with the raw score.

“While the growth index [the numbers the ASD is citing] is important to determining how confident we are that growth exceeds a target, it is not the most meaningful scale for measurement [of] the amount of growth that has occurred,” Changas said.

But the state Department of Education disagrees, as does the ASD.

“You’re going to want to factor in standard error, because it is a buffer and a protective measure,” state education department spokeswoman Ball said. “We want to have the most accurate data possible.”

You can see Paul Changas’s data here, and check out the state’s interpretation of the same numbers at the TVAAS website.

Clarification: The growth index is one of many factors the ASD considers when taking over schools. An earlier version of the story described the growth index as the predominant factor.