In the waning days of the Obama administration, a federal study showed that its signature effort to improve struggling schools had failed to produce any clear benefits. It was a parting blow to the old administration but a welcome gift for the incoming one — and the new education secretary quickly seized on the results.
“The previous administration spent seven billion of your dollars on ‘School Improvement Grants,’ thinking they could demonstrate that money alone would solve the problem,” she said triumphantly at the Conservative Political Action Conference. “They tested their model, and it failed miserably.”
But two new studies, one examining San Francisco and the other Ohio, suggest the verdict is not so simple. The findings don’t contradict the national analysis, but they do highlight variations in how, and how well, the program was implemented in different places. As districts and states continue to wrestle with how to improve low-performing schools, the new studies provide some evidence that improvements can happen — and relatively quickly.
“With interventions like SIG, the answer to whether it was a success or failure is almost always ‘It’s complicated,’” said Deven Carlson, an assistant professor at the University of Oklahoma and a co-author of the Ohio report. “I see our study, considered alongside other evaluations of SIG, to fully support an ‘It’s complicated’ narrative.”
A spokesperson for the education department did not respond to a request for comment.
Then-Secretary of Education Arne Duncan announced the ambitious initiative to turn around hundreds of struggling schools in 2009. “We want transformation, not tinkering,” he said.
Starting in 2010, hundreds of schools received grant money — which in Ohio amounted to more than $2,000 per student — and were required to change in one of four ways: fire the principal and make an array of changes, including a longer school day and more rigorous teacher evaluations; make those changes and also fire half of the staff; turn the school over to a charter operator; or close the school altogether. The vast majority of schools chose the first, least disruptive option.
The San Francisco study — appearing in March in the American Educational Research Journal — compared nine schools in the city that received SIG money to similar schools that didn’t participate in the program.
A 2013 San Francisco Chronicle article reported that the grants led to an influx of staff and services: “The money bought summer school classes, computers, books, social workers, nurses, literacy and math coaches, and more. All told, 70 people were hired with the grant funding this year at nine city schools.”
“Basically, anything I need I can ask for,” Monica Giudici, a teacher at one of the nine schools, told the Chronicle.
After three years, the SIG schools had higher test scores and attendance rates, and more parents wanted to send their kids to those schools. The schools did a better job of retaining effective teachers, and offered more professional development, according to surveys of teachers. The gains were largest in the schools that dismissed half their staff.
The Ohio report, recently published through the Ohio Education Research Center, showed that students in 74 schools receiving SIG funds saw large test score gains and higher graduation rates after two or three years. The results, however, dissipated when the grant ran out, suggesting that the program led to meaningful short-term improvements but didn’t spur longer-term shifts in school quality.
However, Ohio’s “priority schools” initiative — another program pushed by the feds that required schools to make specific changes but didn’t include an infusion of money — did not produce any benefits, according to the study. That suggests that the resources were an important factor in the state’s successful turnarounds.
“Context and capacity likely played a significant role in the success of SIG across states, but we lose sight of this fact because we think of SIG as this singular program where context is irrelevant,” said Carlson.
Andy Smarick, a fellow at the American Enterprise Institute and longtime SIG critic, said he didn’t question the latest results but said it’s important to also look at where the program was less successful. “The worry with these recent studies, of course, is that people will ‘reason from the dependent variable’ — that is, find these instances of purported success and try to draw conclusions about the entirety of SIG or the entirety of turnarounds from them,” he said.
Carlson agrees that his research doesn’t answer the key question of why SIG seems to have been effective in Ohio, but less so elsewhere.
“Our work shows that SIG increased reading and math scores in Ohio, at least for schools near the eligibility threshold,” he said. “It says much less about why.”