At a recent event at Stanford University, Netflix founder and education donor Reed Hastings offered a familiar refrain about educational outcomes in the U.S. “We all know the data — that for 30, 40, 50 years, it’s pretty flat,” he said. “We’ve got a mediocre, by international standards, system that’s not getting any better.”
Hastings didn’t cite any specific numbers. But there’s a good chance he had in mind one particularly troubling trend: 12th graders’ scores on federal math and reading exams have barely budged since testing began in the 1970s. Those stagnant results have worried politicians and policymakers for decades, inspiring calls to remake public education.
But what if that data isn’t telling the full story? What if we’ve been misunderstanding high school achievement for decades?
A growing number of researchers say that’s a real possibility. High school dropout rates have fallen substantially since the 1970s, which means more students who would have left high school altogether are now taking these tests, known as NAEP, the National Assessment of Educational Progress. Comparing scores across decades without acknowledging that, these researchers say, paints a misleadingly grim picture of the country’s progress.
“It’s a big deal,” said Kirabo Jackson, a Northwestern University education researcher. “It’s not just some weird quirky theoretical idea — no. It’s a problem.”
“There’s almost no doubt in my mind that declining dropout rates have pushed down high school NAEP scores,” said Marty West, a Harvard University professor and member of the board that oversees the exam.
It’s unclear how big an effect the change in dropout rates is having on test scores, in part because the federal government has done little to figure this out. When questioned, federal officials have said dropout changes could affect high school scores, but have downplayed the likely impact.
At stake is the accuracy of the conclusions drawn from “the nation’s report card.”
Since 1971, the federal government has given periodic exams to American students in elementary, middle, and high school. The tests are administered by the National Center for Education Statistics and overseen by an independent board. (A spokesperson for the governing board referred questions to NCES.)
The NAEP scores are widely reported on, including by Chalkbeat, and touted by policymakers. The exams are sometimes referred to as the “gold standard” of student assessments, and for good reason: There are no other national tests that so carefully track the progress of American students.
But there’s no indication that officials are addressing a long-standing challenge that could be skewing our understanding of high school scores.
In 1970, 17 of every 100 teenagers and young adults had dropped out of high school, according to estimates from the Census Bureau. By 2020, the number had fallen to just 5 in 100.
This encouraging trend could obscure progress elsewhere. Students who have dropped out by the time the test is given at the end of high school simply aren’t counted, and those students are much more likely to be low-scoring. That probably means high school scores have always been artificially inflated, but the problem was likely worse decades ago. Over time, scores might appear stagnant precisely because high schools are doing a better job of keeping more students in school.
The 12th grade trends are “misleading,” said Andrew Ho, a Harvard professor who until recently served on the NAEP governing board. “I would essentially asterisk every grade 12 trend.”
“It’s really important not to ignore this,” said Drew Bailey, an education researcher at the University of California, Irvine.
In an interview, NCES Commissioner Peggy Carr said the agency has not attempted to estimate how dropout rate changes are affecting high school scores because it would be difficult to do so precisely.
“We think that a change in graduation rate is likely to have an impact,” she said, though she suspects the effect is modest. “That’s my educated guess.”
NCES does note that the exam is only given in schools, but it doesn’t specifically highlight the possibility that declining dropout rates are pushing down progress in 12th grade.
Some researchers think the agency needs to do more. “It’s a disservice that the NCES is doing to even report the 12th grade NAEP without these considerable caveats,” said Jackson.
Carr says the onus is on skeptics to prove this concern is significant. “Give me the results that you’re using,” she said. “I’m not denying there probably is some impact on the scores. I’m just saying, how much is it?”
No one knows for sure. NAEP does not follow individual students over time, which makes figuring this out challenging.
When Kristin Blagg, a researcher at the Urban Institute, examined why American students have made notable gains in elementary and middle school, but not high school, she couldn’t find any “smoking gun” explanation. But the data was limited, and she suspects that changing dropout rates are dragging down progress.
“It’s definitely a substantial part of the story,” said Blagg.
Her paper, published in 2016, called for NCES to collect better data to help understand stagnant high school scores.
Recently, the agency announced a number of changes to NAEP, but none of them involve efforts to understand whether shifting dropout rates are affecting scores at the end of high school.
“At a minimum, this issue deserves much more serious examination than it has received to date,” said West, the Harvard professor and NAEP board member.
Still, even if high school students have made more gains in the long run than widely believed, schools likely still have room to improve. And more recently, test scores have stagnated not just in high school but in the lower grades too.
“The relevant question in education policy isn’t whether we’re doing better than we were X years ago,” said West. “The relevant question is whether our school systems are meeting the challenge of the day.”
Matt Barnum is a national reporter covering education policy, politics, and research. Contact him at firstname.lastname@example.org.