How much learning did students really lose this spring?
This summer, education organizations offered fairly dire predictions: Thanks to widespread school closures, students would be starting the school year dramatically behind.
But newer data indicates those projections were overstated. Most students did begin this school year behind where they would have been in math, test results show. But students didn’t fall as far as some projections suggested they would, and there is not consistent evidence that low-income students or students of color fell substantially further behind their white and affluent peers.
“The loss is less than expected. But there is still a situation where we see kids not doing as well in mathematics, for sure,” said Chris Minnich, the head of NWEA, a nonprofit testing organization that released an analysis of new data Tuesday. “My headline is: better than expected, but it’s still a challenge for the country.”
That relatively good news comes with a significant asterisk. Fewer students than usual took the tests, and it’s possible that some of those “missing” students lost the most learning. The data also doesn’t tell us anything about learning during this school year, or about other kinds of harm done to students by the pandemic in the form of social isolation or failed courses.
But it is a first look at the academic effects of an unprecedented disruption in American schooling. And with the National Assessment of Educational Progress canceled this year — and state tests set to be difficult to pull off — it may be a while before there’s a more precise gauge of how the pandemic affected student achievement.
Learning loss wasn’t as bad as expected, suggesting some success of remote instruction
Soon after school buildings closed this spring, NWEA predicted that students would lose half of a year’s progress in math (10 to 20 percentile points) and 30% of a year in reading (6 to 8 points) by the fall. CREDO, an education research organization, warned that students would lose hundreds of days of learning. The consulting company McKinsey projected that students would fall several months behind.
These projections were widely cited, and in some cases discussed as if the losses had already happened. But the estimates were often based on worst-case assumptions that students would not learn anything new and actually lose past learning — treating remote instruction as an extension of the summer.
When NWEA researchers looked at the actual data from more than 4 million students in grades 3 through 8 who took its MAP test at the beginning of this school year, the results weren’t as grim.
In math, the researchers found the average student this year was 5 to 10 percentile points below the average student at the same school last year, depending on their grade. In reading, this year’s students and last year’s students scored about the same.
Renaissance, another testing company, recently found that the average elementary and middle school student fell 7 percentile points in math and 1 point in reading.
NWEA and Renaissance also found only modest differences in learning loss across different groups of students. For example, students in high-poverty schools lost 9 points in math and 2 in reading, according to the Renaissance data.
Why weren’t the results as bad as some projected? Probably because for lots of students, remote instruction wasn’t an extended summer break.
When Debi Bober, a fifth grade teacher in Long Beach, California, learned in the middle of a Friday in March that her school building would be closing, she immediately got to work.
“I pulled them all in and had a heart to heart,” she said. “We’re going to be learning online. It’s still me. It’s still our class. It’s just going to be on a computer screen.” Then she filled their backpacks with materials, and that weekend, she scrambled to pull together resources and make an introductory video. “We just went from there,” she said, and her students remained engaged throughout the spring.
What the latest data misses
It’s not clear that the new data has captured the full extent of spring learning loss, though, because an unusually large number of students were not tested this year. Even in schools that gave NWEA’s test both years, 25% of students who took the test last year didn’t take it this year. (Last year, that dropoff was just 15%.)
That could reflect enrollment declines, as some families opted for private school or homeschooling. Schools may have given the test only to certain students. Some students may have been absent when the test was given, or more fully disconnected from school.
“While there’s some good news here, we want to stress that not all students are represented in the data,” said Beth Tarasawa, NWEA’s executive vice president of research.
In addition to missing students, this data also reflects missing schools. Tests like NWEA’s MAP are voluntarily administered by schools or districts, unlike mandatory state tests, and fewer schools gave the test this year.
Plus, the latest data was collected at the beginning of this school year, so it only measures the effects of this spring and summer, when few children were receiving in-person instruction. This school year, students’ experiences have diverged: students of color and low-income students are much more likely to be learning fully virtually, and parents of students attending school fully in person report higher-quality instruction. All this suggests that even if test score gaps didn’t grow last school year, they may expand this year.
Finally, the data don’t capture other ways the pandemic has created challenges for students.
Melissa Dorcemus, who works at a New York City high school, said the biggest issue for students at the start of the school year wasn’t missing academic skills but transitioning to online learning.
“The conversation was never like, oh their skills are super behind, they missed out on all this content,” she said. “It was more like, get them back in some good work habits.”
The continued challenges of measuring learning loss
Getting a reliable national measurement of learning loss during this school year may continue to prove vexing. Last week, Education Secretary Betsy DeVos announced that the National Assessment of Educational Progress — a widely trusted, low-stakes federal exam — would not be administered in 2021 because of the challenges posed by the pandemic.
Top congressional Democrats said the cancellation underscores the need for federally required state tests.
“Today’s announcement makes 2021 administration of statewide assessments required by federal law a moral imperative,” said Sen. Patty Murray and Rep. Bobby Scott in a joint statement last week. “In the absence of NAEP and without statewide assessments, parents, educators, and policymakers would have zero data on the scope of learning loss.”
Civil rights groups have made a similar case.
Theoretically, annual state exams would provide a broader picture than optional tests like NWEA’s MAP. But state tests will likely run into similar challenges: with so many students not attending classes in person as virus cases rise, it could be very difficult to get all of them to take the tests.
Some are hoping that the incoming Biden administration will provide states with another waiver from annual testing requirements, arguing that testing will prove impractical and stressful. Bober, the Long Beach teacher, said there are other ways to measure student learning, and that state tests would just be another challenge in an already difficult year.
“I know that there’s a national need to track data, but is it really what’s best for our kids?” she said.