why the jump?

What caused New York City’s state test scores to jump?

PHOTO: Monica Disare
State Education Commissioner MaryEllen Elia at the School of Diplomacy in the Bronx.

When State Commissioner MaryEllen Elia announced this year’s state test scores, she said she wasn’t sure exactly what caused such a big statewide bump — nearly 7 percentage points in English proficiency and about 1 percentage point in math.

“We cannot pinpoint exactly why the test [scores] increased,” she told reporters on Friday afternoon.

Her comments immediately turned the spike in scores into an education-world Rorschach test, and everyone saw something different in the inkblot. Mayor Bill de Blasio claimed victory for the city’s nearly 8 percentage-point increase in English proficiency, charter school advocates zeroed in on the even bigger increase in charter test scores, and researchers rolled their eyes, pointing out that test scores are an unreliable marker of progress — especially when the tests themselves have changed.

So who’s right? The answer likely involves some combination of student learning and test tweaks. We’ve compiled a list of the most prominent theories and looked at the evidence for each.

The de Blasio reforms are working

City officials wasted no time claiming de Blasio-era reforms drove the rise in test scores.

“A lot is changing, and this is pure, hard evidence that these changes are working, and we expect a lot more to come,” said de Blasio at a Monday press conference. He cited his “Renewal” program for struggling schools; his administration’s support of community schools, which offer additional services to families; and his universal pre-K push.

De Blasio’s case is bolstered by the fact that the city’s proficiency rates increased more, on average, than rates statewide. While the share of students statewide reaching proficiency on the English exams rose by 6.6 percentage points, the city’s rate rose by 7.6 percentage points. Commissioner Elia also gave the city kudos, saying a renewed focus on teacher training and writing might explain the jump in scores.

State tests got easier

Could de Blasio-era reforms explain the entire increase in test scores? Probably not.

Test scores went up significantly across the state — so much so that Elia herself cautioned that this year’s results are not an “apples-to-apples” comparison to last year’s. In response to the backlash over the introduction of Common Core-based assessments, officials made a number of changes to the tests this year, including shortening them and giving students unlimited time. Researchers said those changes likely explain some, if not much, of the statewide increase.

The increases “are sufficiently large that it makes me think there’s something about the difference in the tests from last year that accounts for the difference in growth,” said Aaron Pallas, a professor of sociology and education at Teachers College at Columbia University.

Charter schools are part of the answer

Just as quickly as de Blasio and schools Chancellor Carmen Fariña celebrated the rise in scores, charter school advocates — frequent rivals of de Blasio — jumped in with their own good news.

The English proficiency rate at city charter schools rose by 13.7 percentage points, beating the city’s overall average increase by a fair margin. Success Academy CEO Eva Moskowitz dismissed the rising scores at traditional district schools, arguing that because they closely mirrored the statewide gains, they could be explained by the test changes. To “find real improvement,” she wrote in the New York Daily News, officials should look to charter schools instead.

New York City charter schools’ scores are analyzed separately from district schools’, so the charter growth didn’t contribute to — or account for — the city’s bump, state officials said. But charter scores did contribute to the statewide increase.

The Common Core is working

There might be other explanations, but here’s the last one we’ll explore: The Common Core is working.

In 2013, state officials implemented tests aligned to the more rigorous Common Core learning standards. Experts knew the new tests would likely cause an immediate drop in scores, but officials hoped that over time, students and teachers would adjust to the new material and eventually test scores would rise.

Could this be a sign they were right? One piece of supporting evidence is the fact that the biggest increases in English proficiency came among third-graders, who started their elementary school education with a Common Core curriculum. Third-grade proficiency levels in the state increased by 10.9 percentage points.

That did not go unnoticed by the Education Trust, a nonprofit that heralded the progress on state tests as a sign that higher standards work.

“The Common Core state standards and tests have been unfairly demonized and used to excuse the failures of our education system,” two leaders of the group wrote. “When we truly listen to what teachers, parents and students are saying, we know that high standards, implemented well, enable students to thrive.”

In the end, it’s likely too early to know exactly what drove the results, said Pallas, the Columbia testing expert. He is trying to isolate how much of the change has to do with test structure, as opposed to better instruction or learning. Right now, he said, parsing the two is tricky.

“There’s just too many moving parts right now,” he explained. “We’ll be able to have a better sense of what’s going on [eventually], but right now we’re in this gray area.”

 

more tweaks

For third straight year, TNReady prompts Tennessee to adjust teacher evaluation formula

PHOTO: Grace Tatter
Education Commissioner Candice McQueen announced last April that she was suspending TNReady testing for grades 3-8 for the 2015-16 school year. Now, her department is asking lawmakers to make more adjustments to the weight of student test scores in Tennessee's teacher evaluation formula.

First, Tennessee asked lawmakers to make temporary changes to its teacher evaluations in anticipation of switching to a new test, called TNReady.

Then, TNReady’s online platform failed, and the state asked lawmakers to tweak the formula once more.

Now, the State Department of Education is asking for another change in response to last year’s test cancellation, which occurred shortly after the legislative session concluded.

Under a proposal scheduled for consideration next Monday by the full House, student growth from TNReady would count for only 10 percent of teachers’ evaluation scores this year and 20 percent next school year. That’s compared to the 35 to 50 percent, depending on the subject, that test scores counted for in 2014-15, before the state switched to its more rigorous test.
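
To make the change in weights concrete, here is a rough back-of-the-envelope sketch in Python. The 50, 10 and 20 percent growth weights come from the figures described above; the sample scores, the 1-5 scale and the assumption that all remaining weight falls on observations and other measures are invented for illustration.

```python
def composite(growth_score, other_score, growth_weight):
    """Weighted average of a test-based growth score and everything else
    (observations and other measures), on a common 1-5 scale."""
    return growth_weight * growth_score + (1 - growth_weight) * other_score

growth, other = 2.0, 4.0  # hypothetical component scores on a 1-5 scale

for label, weight in [("2014-15 (up to 50 percent)", 0.50),
                      ("this year (10 percent)", 0.10),
                      ("next year (20 percent)", 0.20)]:
    print(f"{label}: composite = {composite(growth, other, weight):.2f}")
```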

The bill, carried by Rep. Eddie Smith of Knoxville, is meant to address teachers’ concerns about being evaluated by a brand new test.

Because testing was cancelled for grades 3-8 last spring, many students are taking the new test this year for the first time.

“If we didn’t have this phase-in … there wouldn’t be a relief period for teachers,” said Elizabeth Fiveash, assistant commissioner of policy. “We are trying to acknowledge that we’re moving to a new assessment and a new type of assessment.”

The proposal also mandates that TNReady scores count for only 10 percent of student grades this year, and for 15 to 25 percent by 2018-19.

The Tennessee Education Association has advocated scrapping student test scores from teacher evaluations altogether, but its lobbyist, Jim Wrye, told lawmakers on Tuesday that the organization appreciates the move to slow the process yet again.

“We think that limiting it to 10 percent this year is a wise policy,” he said.

To incorporate test scores into teacher evaluations, Tennessee uses TVAAS, a formula that’s supposed to show how much teachers contributed to individual student growth. TVAAS, which is short for the Tennessee Value-Added Assessment System, was designed to be based on three years of testing. Last year’s testing cancellation, though, means many teachers will be scored on only two years of data, a sore point for the TEA.

“Now we have a missing link in that data,” Wrye said. “We are very keenly interested in seeing what kind of TVAAS scores that are generated from this remarkable experience.”

Although TVAAS, in theory, measures a student’s growth, it really measures how a student does relative to his or her peers. The state examines how students who have scored at the same levels on prior assessments perform on the latest test. Students are expected to perform about as well on TNReady as their peers with comparable prior achievement in previous years. If they perform better, they will positively impact their teacher’s score.
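
For readers curious about the mechanics, here is a deliberately simplified sketch, in Python, of that peer-comparison idea. It is not the actual TVAAS model, which is a far more elaborate statistical system built on multiple years of data; the grouping rule, sample scores and teacher labels below are invented for illustration.

```python
from collections import defaultdict

# Toy records: (teacher, prior-year score, current-year score). Invented data.
students = [
    ("Teacher A", 300, 320),
    ("Teacher A", 310, 315),
    ("Teacher A", 320, 340),
    ("Teacher B", 300, 305),
    ("Teacher B", 310, 330),
    ("Teacher B", 320, 325),
]

# Step 1: for each prior-achievement level, average the current-year scores of
# all students who started at that level. That average stands in for the
# "expected" score of a student with comparable prior achievement.
by_prior = defaultdict(list)
for _, prior, current in students:
    by_prior[prior].append(current)
expected = {prior: sum(scores) / len(scores) for prior, scores in by_prior.items()}

# Step 2: a student's growth is how far they land above or below that
# expectation; a teacher's score here is just the average of those gaps.
teacher_growth = defaultdict(list)
for teacher, prior, current in students:
    teacher_growth[teacher].append(current - expected[prior])

for teacher, gaps in sorted(teacher_growth.items()):
    print(teacher, round(sum(gaps) / len(gaps), 1))
```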

Using test scores to measure teachers’ contributions to student growth has been the source of other debates around evaluations.

Historically, teachers of non-tested subjects such as physical education or art have been graded in part by schoolwide test scores. The House recently passed a bill that would require the state to develop other ways to measure growth for those teachers, and it is now awaiting passage by the Senate.

 

deja vu

Last year, Ritz’s computer-based testing plan was largely dismissed. Today, McCormick adopted part of it as her own.

PHOTO: Shaina Cavazos
Glenda Ritz and Jennifer McCormick debated in Fort Wayne during the 2016 campaign this past fall.

Although she wasn’t on board with former state Superintendent Glenda Ritz’s entire testing plan during last year’s campaign, current Indiana schools chief Jennifer McCormick today expressed support for a computer-based test format Ritz lobbied hard for during her last year in office.

These “computer-adaptive” exams adjust the difficulty level of questions as students answer correctly or incorrectly. McCormick explained the format to lawmakers today when she testified on the “ILEARN” proposal that could replace the state’s unpopular ISTEP exam if it becomes law.

Computer-adaptive technology, she said, allows tests to be more tailored around the student. Test experts who spoke to Indiana policymakers this past summer have said the tests also generally take less time than “fixed-form” tests like the current ISTEP and could result in quicker turnaround of results.
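
As a rough illustration of what “computer-adaptive” means in practice, the Python sketch below picks each next question based on whether the previous answer was right or wrong. Real adaptive engines select items with statistical (item response theory) models rather than this simple step-up/step-down rule, and the question bank, step size and simulated student here are all invented.

```python
import random

# Hypothetical item bank: difficulty ratings from 1 (easiest) to 10 (hardest).
ITEM_BANK = {d: f"question at difficulty {d}" for d in range(1, 11)}

def run_adaptive_test(answers_correctly, num_items=5, start=5):
    """Serve num_items questions, stepping difficulty up after a correct answer
    and down after an incorrect one -- a toy stand-in for real item selection."""
    difficulty = start
    history = []
    for _ in range(num_items):
        correct = answers_correctly(difficulty)
        history.append((ITEM_BANK[difficulty], correct))
        difficulty = min(10, difficulty + 1) if correct else max(1, difficulty - 1)
    return history

# Simulated student who is likelier to answer easier questions correctly.
random.seed(0)
for item, correct in run_adaptive_test(lambda d: random.random() < (11 - d) / 10):
    print(item, "-> correct" if correct else "-> incorrect")
```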

During the summer, members of a state commission charged with figuring out what Indiana’s new testing system could look like largely argued against this format. Among them was the bill’s author, Rep. Bob Behning, R-Indianapolis. At the time, he said he was concerned about investing in a technology-heavy plan when much of the state struggles to get reliable internet and computer access. Today, Behning didn’t speak against the concept.

Overall, McCormick was supportive of House Bill 1003, but she pointed out a few areas that she’d like to see altered. More than anything, she seemed adamant that Indiana get out of the test-writing business, which has caused Hoosiers years of ISTEP-related headaches.

Read: Getting rid of Indiana’s ISTEP test: What might come next and at what cost

“Indiana has had many years to prove we are not good test-builders,” McCormick told the Senate Education Committee today. “To continue down that path, I feel, is not very responsible.”

The proposed testing system comes primarily from the recommendations of the state commission. The biggest changes would be structural: The bill would have the test given in one block of time at the end of the year, rather than in the winter and spring. The state would also go back to requiring end-of-course assessments in high school English, Algebra I and science.

The bill doesn’t spell out if the test must be Indiana-specific or off-the-shelf, and McCormick suggested the state buy questions from existing vendors for the computer-adaptive test for grades 3-8, which would have to be aligned with state standards.

For high school, McCormick reiterated her support for using the SAT and suggested making the proposal’s end-of-course assessments optional.

The ILEARN plan, if passed into law, would be given for the first time in 2019.

“Spring of 2019 is a more realistic timeline, no matter how painful it is for all of us,” McCormick said. “We could do it for (2018), but it might not be pretty. We tried that before as a state, and we couldn’t get it right.”

You can find all of Chalkbeat’s testing coverage here.