A fairer way to judge high schools? This state is trying to find out which schools really help students graduate.

A high school graduates only 65% of its students. Does that mean it’s a bad school?

Most would assume yes. In fact, federal law requires that high schools with graduation rates below 67% be flagged as low-performing.

But what if that high school serves only students who were unlikely to graduate at all — who arrive far below grade level and on the verge of dropping out? Is that school actually doing a worse job than the school with a 90% graduation rate serving mostly affluent students?

In a new report, researchers argue for a different, more useful way to judge high schools that accounts for those factors. They crafted it using data from Louisiana’s high schools, and an official at the Louisiana education department says the state is interested in the new approach, though no plans have been made.

“We say to the system, you’re responsible for making sure kids are successful, and at least in Louisiana, we don’t do enough right now to equip our principals to be able to understand and reflect on that,” said Jessica Baghian, an assistant superintendent. “It is our job to fix that.”

The new approach finds that some schools are doing much better than others, and the schools doing the best aren’t more likely to be in wealthier areas. But rating and ranking schools quickly veers into fraught territory — likely why Louisiana is focusing on how the new data could be used to inform schools, rather than sanction them.

“We really want to do case studies with those schools to understand better, what’s the secret sauce there? What’s happening? What can we learn from that?” Baghian said.

Researchers try to disentangle demographics and high school quality

Concerns about the way we measure school performance are longstanding, since most methods rely on test scores, which closely track with student poverty rates.

Researchers have developed complicated fixes known as “value added” metrics. Those try to isolate the impact of individual schools and teachers, and are used mostly in elementary and middle schools. Value-added metrics proved controversial when they were incorporated into high-stakes teacher evaluations in many states.

The method isn’t used much for high schools, where annual testing is not required and a key metric, high school graduation, is an all-or-nothing outcome: a student either graduates or doesn’t, so there is no year-to-year “growth” to measure.

A fundamental problem still exists, though. “A simple comparison of the college enrollment rates of students from a high-income suburban high school and students from a low-income rural high school, for example, cannot tell us whether one school is doing better at promoting college enrollment for the students it serves,” write the researchers with the firm Mathematica.

That’s where the new approach comes in. The researchers created a way to estimate how much schools contribute to students’ chances of graduating high school and enrolling and persisting in college. (The report was commissioned by the Louisiana Department of Education with funding from the Walton Family Foundation. Walton is also a funder of Chalkbeat.)

To do so, they control for a number of factors, including students’ eighth-grade test scores and attendance as well as whether they come from a low-income family, have a disability, or have limited English proficiency. The idea is to hold constant factors outside of schools’ control — though the study may not be able to account for all such factors.

Then the researchers predict each high school’s graduation and college attendance rates based on the students it serves. A school that hits its predicted rates is average, while a school that beats them by a wide margin is substantially above average.

In Louisiana, a student at the average high school has a 75% chance of graduating. But if a student goes to an exceptional school, their chance jumps to 89%, the researchers found. Similarly, college enrollment goes from 46% at an average school to 59% at an exceptional one. The researchers refer to this impact as a school’s “promotion power.”
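The predicted-versus-actual comparison at the heart of this approach can be sketched in a few lines of code. Everything below is an invented illustration of the general idea: the toy model, its coefficients, and the student data are assumptions for demonstration, not the report’s actual statistical method.

```python
# Toy illustration of "promotion power": a school's actual graduation
# rate minus the rate predicted from its students' incoming traits.
# The model and all numbers here are made up for illustration.

def predicted_grad_prob(student):
    """Toy linear-probability model built on eighth-grade factors.
    Coefficients are invented, not estimated from real data."""
    p = 0.75                                       # statewide average chance
    p += 0.10 * (student["grade8_score"] - 0.5)    # test score on a 0-1 scale
    p -= 0.05 * student["low_income"]              # 1 if low-income, else 0
    p -= 0.04 * student["disability"]              # 1 if has a disability
    return min(max(p, 0.0), 1.0)                   # keep it a valid probability

def promotion_power(students, actual_grad_rate):
    """School's actual graduation rate minus its predicted rate."""
    predicted = sum(predicted_grad_prob(s) for s in students) / len(students)
    return actual_grad_rate - predicted

# A hypothetical school serving three students with below-average odds:
students = [
    {"grade8_score": 0.3, "low_income": 1, "disability": 0},
    {"grade8_score": 0.4, "low_income": 1, "disability": 1},
    {"grade8_score": 0.5, "low_income": 0, "disability": 0},
]
print(round(promotion_power(students, actual_grad_rate=0.80), 3))  # → 0.107
```

A positive score means the school graduates more students than its demographics alone would predict, which is why a 65% school can outrank a 90% one on this measure. The real analysis also controls for attendance and English proficiency and handles uncertainty statistically, which this sketch omits.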

A school’s score usually doesn’t shift much from year to year, and there is little if any correlation between schools’ poverty rates and their “promotion power.”

“I’m not saying value-added is perfect, but it’s certainly a better measure than the raw one,” said Kirabo Jackson, an education researcher at Northwestern University who recently examined how Chicago high schools affected students’ test scores and social-emotional skills. “I think the attempt is laudable.”

The Mathematica researchers aren’t the only ones thinking about this issue. The Urban Institute released a report in January that uses a similar approach to calculate how high schools in three states affect students’ futures. Their study, too, found that schools with similar demographics varied in how much they helped students graduate and enroll in college.

How can these new measures be of use to schools and districts?

Baghian, of Louisiana’s education department, said one option is to simply provide the results to principals.

“I want to give our high school leaders better information than they have — and that I believe almost anybody in the country has — about what’s happening when kids leave,” she said, noting that state officials are just starting to dig into the findings.

Districts might consider trying to “learn from what’s happening in the school that’s producing these better outcomes and share it with the school that isn’t doing so well,” said Constance Lindsay, co-author of the Urban Institute study.

That might not be controversial. But other potential uses would be, like making “promotion power” part of a state’s school accountability system. Federal law requires states to include overall high school graduation rates in their systems, though a “promotion power”-style measure could theoretically be used in addition. Baghian says there are no immediate plans to do so in Louisiana.

But if this does become a possibility, some might worry that these measures effectively lower expectations for schools serving high-needs students and reduce their perceived need for extra support. A school with a 65% graduation rate but high “promotion power” is still failing to graduate more than a third of its students.

“There’s this concern that if you go too far on growth you’re congratulating mediocrity, and you’re lowering the bar for certain kids,” said Baghian. “I think that is real. We have to keep an eye on that.”

Meanwhile, even talking about “value-added” may make some teachers skittish. There was fierce backlash to the use of value-added measures for teacher evaluation, and a number of states have scrapped them.

Phil Weinberg, a former high school principal and top New York City schools official, said he is intrigued by the idea in the Louisiana paper but warned against simply ranking schools.

A measure like this, he said, could be used “as a jumping off point for what can we accelerate to be better, what’s working, what might not be bringing value to students.”

“If it’s not used as a tool for learning,” he said, “its value decreases a great deal.”