testing 1-2-3

If state tests keep changing, should they still be used to judge struggling schools?

PHOTO: Monica Disare
Chancellor Carmen Fariña

In a packed room of educators, New York City schools Chancellor Carmen Fariña announced that the city’s turnaround program for struggling schools is making extraordinary progress.

“I want to be clear,” Fariña said. “English proficiency … increased at 59 out of the 63 [Renewal] schools. Let me say this again, 59 out of 63 schools.”

Fariña is correct, but the state offered a more tempered assessment of the scores. On Friday, State Education Commissioner MaryEllen Elia said changes to this year’s test, such as offering students unlimited time and asking fewer questions, meant this year’s scores were not an “apples-to-apples comparison” with last year’s.

Her statement underscores what critics see as a dilemma: As tests continue to change, how can officials judge yearly progress on major initiatives?

That question is particularly relevant to two high-profile programs for struggling schools — the state’s receivership program and the city’s “Renewal” school program — both of which carry penalties for underperforming schools and use test scores as one way to gauge student progress. Renewal schools are expected to show improvements between 2015 and 2017, but the test process has changed within that timeframe — and could change again next year.

“It’s like trying to judge the success of a weight loss program when you have three different scales that you can’t count on,” said Aaron Pallas, a professor of sociology and education at Teachers College at Columbia University.

City officials defended the comparison between 2015 and 2016 test results, saying the rigor of the exam remained the same this year, only the structure of the test changed. They also noted that multiple benchmarks will be used to evaluate “Renewal” schools, not just test scores.

“These tests are not easier,” Fariña said during a Monday press conference. “I want to be clear on that. These tests had the same rigor as the one they took last year.”

State officials reiterated that the tests were “comparably rigorous” to last year’s assessments and said they will review the indicators used to judge improvement in struggling schools and make sure they are “working as intended.”

Even if the comparison is flawed, some say, the test scores are still useful in assessing progress. “We have to have some point of comparison for how our students are doing, as imperfect as it might be,” said David Albert, spokesman for the New York State School Boards Association.

This year is not the first in which tests have changed — far from it. The tests have been revised multiple times over the last decade, with certain years showing large swings due to those changes.

After test scores dropped in 2013 with the introduction of Common Core standards and exams, the state vowed to revise the test to address concerns.

That process will likely take years, and during the transition period, grades 3-8 math and English tests will not be used to evaluate teachers. But state and city officials are still using those tests to judge struggling schools, and that’s a problem, said David Bloomfield, an education professor at Brooklyn College and the CUNY Graduate Center. Schools on the state’s or city’s lists of low-performing schools could face consequences, such as being taken over by an outside receiver or closed, if they fail to meet academic benchmarks.

“This isn’t a new phenomenon, it’s just that earlier there weren’t high stakes,” Bloomfield said. “Now because of the particularly short time span that’s involved, they are being used way beyond their ability for accurate measurement.”

Even without changes to the test, yearly fluctuations should be viewed carefully, said Roey Ahram, director of research and evaluation at the NYU Steinhardt Metropolitan Center for Research on Equity and the Transformation of Schools.

“You always have to look at test scores with a grain of salt, whether they are changing or not,” he said. “The asterisk is there for a reason.”

rules and regs

State shortens length of ‘gag order’ on teachers discussing Regents questions online

PHOTO: G. Tatter

After pushback from teachers, the State Education Department has changed a new provision that temporarily prohibits teachers from discussing Regents exam questions online.

The original rule stated that teachers could not use email or a listserv to discuss test questions or other specific content with other teachers until a week after the exam period ended on June 23. As Chalkbeat reported Tuesday, teachers objected, arguing that they sometimes needed to discuss questions in order to properly grade the tests or to challenge questions that seem unfair.

Under the change, tests taken between June 13 and June 16 can be discussed online beginning June 23. And for those taken between June 19 and June 22, teachers can discuss content online beginning June 27.

According to education department officials, the provision was intended to ensure that testing material did not spread online before all students had completed their exams, particularly among schools that serve students with special needs, who qualify for multiple-day testing.

“We believe that nearly all students who are testing with this accommodation will have completed their exams by these dates,” Steven Katz, director of the Office of State Assessment, wrote in a memo to school principals and leaders.

Still, Gene Gordon, a longtime physics teacher and former president of the Science Teachers Association of New York State, noted that, to some extent, the damage was done, since the amendment to the rule came out only after many teachers had already graded their exams.

“It did not have any real effect,” Gordon said.

The New York State United Teachers — which criticized the new provision on Tuesday as a “gag order” and called for its repeal — called the amendment a “clear victory” for educators. Still, NYSUT spokesman Carl Korn told Chalkbeat, “it clearly will be more helpful in the future than this year.”

Testing Testing

Calculator mix-up could force some students to retake ISTEP, and Pearson is partially to blame

PHOTO: Ann Schimke

ISTEP scores for thousands of students across the state will be thrown out this year, including at two Indianapolis private schools, according to state officials.

The mishap can be traced back to calculators. Students at 20 schools used calculators on a section of the 2017 ISTEP math test when they shouldn’t have — in at least one district because of incorrect instructions from Pearson, the company that administers the tests in Indiana.

It’s a small glitch compared to the massive testing issues Indiana experienced with its previous testing company, CTB/McGraw-Hill. But years of problems have put teachers, students and parents on high alert for even minor hiccups. In 2013, for example, about 78,000 students had their computers malfunction during testing. Pearson began administering ISTEP in 2016.

The calculator mix-up involving Pearson happened in Rochester Community Schools, located about two hours north of Indianapolis. About 700 students in three schools received the incorrect instructions.

Molly Deuberry, spokeswoman for the Indiana Department of Education, said that Rochester is the only district known to have received the incorrect instructions, but the state is also investigating calculator-related problems at 19 other schools.

According to federal rules, students who use calculators on non-calculator test sections must have their scores labeled as “undetermined.” Current sophomores will need to retake the test, since passing the 10th-grade exam is a graduation requirement in Indiana. Students will have multiple opportunities to do so, including during the summer, state officials said.

It’s not clear how the invalidated scores will affect those schools’ A-F letter grades. It is up to the Indiana State Board of Education to handle A-F grade appeals, which districts can request once grades are released.

“The Department and State Board will collaborate to ensure that the State Board receives sufficient detail about this incident when reviewing the appeals,” the education department said in an email.

Pearson spokesman Scott Overland said in an email that the company would work with the education department to follow up on the calculator issues and correct its processes for next year.

“In some cases, Pearson inadvertently provided inaccurate or unclear guidance on the use of calculators during testing,” Overland said. “In these instances, we followed up quickly to help local school officials take corrective action.”

Here are the districts and schools the state says had students incorrectly use calculators on this year’s ISTEP:

  • Covington Christian School, Covington
  • Eastbrook South Elementary, Eastbrook Schools
  • Eastern Hancock Elementary School, Eastern Hancock County Schools
  • Emmanuel-St. Michael Lutheran School, Fort Wayne
  • Frankfort Middle School, Frankfort Community Schools
  • George M. Riddle Elementary School, Rochester Community Schools
  • Lasalle Elementary School, School City of Mishawaka
  • New Haven Middle School, East Allen County Schools
  • Rochester Community Middle School, Rochester Community Schools
  • Rochester Community High School, Rochester Community Schools
  • Saint Boniface School, Lafayette
  • Saint Joseph High School, South Bend
  • Saint Roch Catholic School, Indianapolis
  • Silver Creek Middle School, West Clark Community Schools
  • St. Louis de Montfort School, Lafayette
  • Tennyson Elementary School, Warrick County Schools
  • Thomas Jefferson Elementary School, School City of Hammond
  • Trinity Christian School, Indianapolis
  • Waterloo Elementary School, DeKalb County Schools
  • Westfield Middle School, Westfield-Washington Schools

This story has been updated to include comments from Pearson.