Future of Schools

Schools, board members ask: Are Indiana's grades fair?

Schools across Indiana received their report cards today, with the state rating the highest performers an “A” and the lowest an “F,” terminology well known to the schoolchildren they serve.

But unlike the grade a student receives from a teacher, the state’s grades are not based on daily interactions and observation but on complex mathematical formulas.

This year, perhaps more than ever before, there are reasons to ask: is your school’s grade fair?

“I think it’s worth looking at,” board member Andrea Neal said. “I’m very uncomfortable with the formula.”

Two big problems plagued this year’s grading results: test administration and unpredictability.

Testing woes

Almost 80,000 ISTEP online test takers in May experienced glitches that caused their screens to freeze, or otherwise slowed or stopped their exams. Some schools with widespread glitches have raised concerns that their grades were adversely affected. ISTEP is the backbone of Indiana’s accountability system. Student test scores in grades three through eight are central to judging students, teachers and schools. The number of students who pass and the amount of growth they make over the prior year help determine a teacher’s raise and job security and a school’s A to F grade.

But after last spring’s testing problems, many Indiana educators have raised questions about whether they and their schools can be fairly judged on this year’s scores.

Christel House Academy today was the first school to push back on its grade when it received an F after more than five years of A grades. School officials said their data showed more than 90 percent of students whose grades went from passing to failing had faced online testing trouble. About 40 percent of Christel House’s test takers faced online glitches, but the school’s appeal was denied.

State board member Dan Elsener said the school could now appeal to the state board.

“There’s some reason there’s an anomaly here,” he said. “There’s a whole cohort of schools that don’t like the grades they got because of testing interruptions.”

Elsener said there was little the board could do but approve the grades, despite concerns they might not be accurate for all schools, because an outside consultant determined that very few students were so affected by the glitches that their tests were invalid. That was the only advice the board had to go on, he said.

State Superintendent Glenda Ritz said the education department double-checked the data, going student-by-student to be certain any tests that should have been invalidated were not included in the schools’ results. If all of the scores counted toward the grade were valid, then the state must affirm the grade, she said.

Claire Fiddian-Green of the Center for Education and Career Innovation, an education agency created by Gov. Mike Pence that has often been at odds with Ritz, this time backed her up.

“I’m comfortable they conducted a thorough process with all the right steps,” Fiddian-Green said.

Yet, there is every reason to believe student scores can be affected by unexpected interruptions, said Cynthia Roach, director of research, evaluation and assessment for Indianapolis Public Schools. Just 1,400 tests were invalidated because the state’s consultant determined they were adversely affected, but Roach believes there were probably more students who should have had higher scores.

“It’s almost impossible for a student to take a test and score higher than what they know,” she said. “But it’s very easy to score lower than what they know. Everything affects kids.”

Even the consultant who evaluated the testing problems for Indiana last summer acknowledged that there was no way to definitively identify all students who likely would have had higher scores, Roach said.

Roach told the story of one IPS principal who reported a huge problem with frozen computer screens during ISTEP testing that plagued the students in her school’s gifted class. Afterward questions lingered about how the school’s grade could have been affected, even though those students were likely to pass either way.

“They did fine but was the freeze enough to affect their ability to get high growth?” Roach said. “Who knows?”

Unpredictability

When schools make drastic swings, such as from A one year to F the next or from F to A, a common explanation is that there have also been big changes in the school, such as an influx of new students or heavy turnover of teachers, Roach said.

But in some cases, Indiana schools that have seen none of those changes cannot explain sudden reversals of fortune. With more than a handful of schools making such big shifts, even Ritz wonders whether the problem lies with the system, not the schools.

“A good system will show you a school improving or a school not improving, but not extremes like we are currently seeing in the current model,” she said.

Among the big swings this year are three schools that went from an A or B to an F and 25 schools that went from an F to an A or B. In fact, the oddities of Indiana’s current A to F formula, forged under former state Superintendent Tony Bennett, have Ritz pining for a planned overhaul.

The legislature earlier this year ordered the universally disliked growth measure junked and mandated a new system be created in 2014. Ritz, one of the current system’s critics, said the new system should eliminate most big shifts in school grades.

Stopping short of saying this year’s grades can’t be trusted, Ritz focused on a future with new grading rules.

“We’ve had many schools where we have a fluctuation between two, three or four letter grades, up or down,” she said. “I am very excited we are going to implement, not this year but next year, a new A to F system. We are working toward that, an entire new system for A to F.”

A growth measure in the grade calculation aims to identify which schools did the best job of getting students to raise their test scores. It matches up pools of kids with similar backgrounds who scored about the same on prior tests and ranks them by the progress they made over the previous year. Those with the biggest gains earned extra points for their schools. But from the beginning, a wide range of critics, including some of Bennett’s closest allies, said the measure was too complicated and worried that it could produce unfair results.

In 2011, the first year letter grades were issued, Indiana followed a fairly basic formula for grading schools. It required at least 60 percent of students in a school to pass both math and English on ISTEP and high school tests in order to earn at least a D. Grades went up to a C at 70 percent, a B at 80 percent and an A at 90 percent. Schools that saw their passing rates improve enough from the prior year could get extra credit and potentially move up a letter grade.
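The threshold formula described above can be sketched in a few lines of Python. This is a simplified illustration, not the state's actual calculation: keying the grade to the lower of the two passing rates, and modeling extra credit as a single yes/no flag, are assumptions, since the article does not spell out those details.

```python
def letter_grade(math_pass_pct, english_pass_pct, improved_enough=False):
    """Sketch of Indiana's basic pre-2012 grading formula.

    Assumes the grade is keyed to the lower of the two passing rates;
    the real extra-credit rules were more detailed than one flag.
    """
    rate = min(math_pass_pct, english_pass_pct)
    if rate >= 90:
        grade = "A"
    elif rate >= 80:
        grade = "B"
    elif rate >= 70:
        grade = "C"
    elif rate >= 60:
        grade = "D"
    else:
        grade = "F"
    # Schools whose passing rates improved enough could move up one grade.
    if improved_enough and grade != "A":
        grade = "ABCDF"["ABCDF".index(grade) - 1]
    return grade
```

Under this sketch, a school with a 65 percent math passing rate would earn a D, or a C with enough year-over-year improvement.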

In 2012, Bennett scrapped that system, adding in new factors that aimed to measure “college and career readiness” that included the growth model, based on Colorado’s grading system.

But even if they know a new grading scheme is on the way, some board members remain uneasy with this year’s grades.

A Gary principal, frantically calling her late Thursday, was certain errors caused his school’s grade to drop, Neal said, but his appeal was denied.

Neal said she’d rather the state simply report each school’s state test passing rates and how much they improved over the prior year, avoiding the difficulties of explaining how the grades were determined.

“I don’t feel it’s working for all schools,” she said.

Neal pointed to Park Tudor, an expensive and highly regarded private school in Indianapolis, which received a D grade despite 100 percent of its graduates going on to college and a slew of academic honors, as another example of a strange report card result.

Park Tudor spokeswoman Cathy Chapelle said its grade, too, was in error.

“The assessment grade reflects issues of reporting and communication, not of academic performance,” Chapelle said in a statement. “In fact, our academic standards and results are among the highest in the state. In 2013 alone, 201 Park Tudor students in grades 9-12 took a total of 490 Advanced Placement exams; 62% of the exams earned a score of 4 or 5 and over 87% earned a score of 3 or higher.”

Chapelle did not elaborate on what the school meant by “reporting and communication” or how it could have influenced Park Tudor’s grade.

If schools like Christel House and Park Tudor decide to appeal to the state board, would they prevail? Elsener was not encouraging, suggesting the best strategy might be just to move on.

“I think I’d say this year was a hiccup,” he said. “You have to decide where to put your best investment of time.”

Voucher Verdict

Do vouchers help students get to college? Two new studies come to different answers


The debate around school vouchers has exploded in the last year with the appointment of Secretary of Education Betsy DeVos. That also means recent studies showing that student achievement drops, at least initially, when students use public dollars to attend private schools have gotten a lot of attention.

But supporters have countered that test scores only say so much about student performance. The real test is how students do over the long term.

Two studies out Friday offer new answers — and some ammunition for both sides.

The research looks at how students from Milwaukee and Washington, D.C., fared after using a voucher to attend private school. It found students in Milwaukee’s voucher program were more likely to attend four-year colleges, but not necessarily more likely to actually graduate. In D.C., voucher recipients were no more likely to enroll in college.

Here’s what else the studies tell us.

Disappointing results for D.C. voucher program

The D.C. analysis, conducted by Matt Chingos of the Urban Institute, found that 43 percent of students who won a voucher enrolled in college within two years of graduating high school. That’s 3 percentage points lower than the rate for similar students who lost the lottery, though the difference was not statistically significant.

The research relied on that random lottery for allocating vouchers in the first two years of the program. This meant the study could confidently show that any difference between lottery winners and losers was caused by the program, which was created in 2004 and has been a source of controversy ever since.

The study notes that because the sample size of students is fairly small, it can’t rule out the possibility that the program either boosted or hurt college attendance to some degree.

The results are surprising in light of past evidence that the first groups of D.C. voucher participants were more likely to graduate high school and scored higher on reading tests. (A more recent study on the program, focusing on students who participated in later years, found that it caused substantial drops in math test scores.)

Milwaukee voucher recipients more likely to attend — but not necessarily graduate — college

The Milwaukee study offers a more positive story for voucher advocates.

Voucher students were generally more likely to enroll in college, particularly four-year universities, than students with similar test scores from the same neighborhood who were not participating in the program in 2006. For instance, among students who used a voucher in elementary or middle school, 47 percent enrolled in college, compared to 43 percent of similar students.

When it came to actually completing college, though, the results were less clear. The researchers estimated that voucher recipients had a small edge — 1 or 2 percentage points — but the difference was not statistically significant.


In contrast to the D.C. study, the Milwaukee researchers — Patrick Wolf, John Witte, and Brian Kisida — weren’t able to use a random lottery, meaning the results are less definitive. And although the researchers try to make apples-to-apples comparisons, the estimates may be skewed if more motivated families, or students who were struggling in public schools, used a voucher.

The latest results are consistent with a previous Milwaukee study by some of the same researchers. It’s also similar to a recent Florida study suggesting that vouchers led to increases in two-year college enrollment, but had little or no effect on whether students earned a degree.

(Both the Milwaukee and D.C. studies were funded by a number of groups that support school choice, including the Oberndorf Foundation, the Walton Family Foundation, and the Foundation for Excellence in Education. Walton is also a funder of Chalkbeat.)

What we still don’t know

Like the research before it, these studies won’t come close to ending the debate about school vouchers. Opponents will likely highlight the results in D.C. and the inconsistent impact on college completion in Milwaukee. School choice advocates will point to other parts of the Milwaukee study, and the fact that the D.C. voucher program appeared to keep pace with public schools while spending less per student.

Meanwhile, these studies tell us the most about these programs as they existed more than a decade ago. That’s the disadvantage of studies of longer-run effects: even as they capture outcomes that matter more to most policymakers and parents than test scores, their findings arrive years after the fact.

“The problem with these long-term studies is that these are the right outcomes to look at, but by the time we know it, it’s of more questionable relevance,” Chingos said.

Future of Schools

Mike Feinberg, KIPP co-founder, fired after misconduct investigation


Mike Feinberg, the co-founder of the KIPP charter network, has been fired after an investigation into sexual misconduct, its leaders announced Thursday.

KIPP found “credible evidence” connected to allegations that Feinberg abused a student in the late 1990s, according to a letter sent to students and staff. Feinberg denies the allegations.

“We recognize this news will come as a shock to many in the KIPP Team and Family as we struggle to reconcile Mr. Feinberg’s 24 years of significant contributions with the findings of this investigation,” the letter says.

It’s a stunning move at one of the country’s best-known charter school organizations — and one where Feinberg has been in a leadership role for more than two decades. Feinberg started KIPP along with Dave Levin in Houston in 1994, and Levin brought the model to New York City the next year. The network became known for its “no excuses” model of strict discipline and attention to academic performance.

KIPP says it first heard the allegation last spring. The network eventually hired the law firm WilmerHale to conduct an external investigation, which found evidence that Feinberg had sexually harassed two adults, both alums of the school who were then employed by KIPP in Houston, the network said.

“In light of the nature of the allegations and the passage of time, critical facts about these events may never be conclusively determined. What is clear, however, is that, at a minimum, Mr. Feinberg put himself into situations where his conduct could be seriously misconstrued,” KIPP wrote in the letter, signed by CEO Richard Barth and KIPP’s Houston leader, Sehba Ali.

Feinberg’s lawyer, Chris Tritico, told the Houston Chronicle that Feinberg had not been fully informed about the allegations against him.

“The treatment he received today from the board that he put in place is wrong, and it’s not what someone who has made the contributions he’s made deserves,” Tritico said.

Read KIPP’s full letter here.