New York

Integral to "value-added" is a requirement that some score low

Add one more point of critique to the city’s Teacher Data Reports: Experts and educators are worried about the bell curve along which the teacher ratings fell. Like the distribution of teachers by rating across types of schools, the distribution of scores among teachers was essentially built into the “value-added” model that the city used to generate the ratings.

The long-term goal of many education reformers is to create a teaching force in which nearly all teachers are high-performing. However, in New York City’s rankings — which rated thousands of teachers who taught in the system from 2007 to 2010 — teachers were graded on a curve. That is, under the city’s formula, some teachers would always be rated as “below average,” even if student performance increased significantly in all classrooms across the city.

The ratings were based on a complex formula that predicts how students will do — after taking into account background characteristics — on standardized tests. Teachers received scores based on students’ actual test results measured against the predictions. They were then divided into five categories. Half of all teachers were rated as “average,” 20 percent were “above average,” and another 20 percent were “below average.” The remaining 10 percent were divided evenly between teachers rated as “far above average” and “far below average.”

IMPACT, the District of Columbia’s teacher-evaluation system, also uses a set distribution for teacher ratings. As sociologist Aaron Pallas wrote in October 2010, “by definition, the value-added component of the D.C. IMPACT evaluation system defines 50 percent of all teachers in grades four through eight as ineffective or minimally effective in influencing their students’ learning.”
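The fixed distribution described above can be sketched in a few lines of code. This is a hypothetical illustration, not the city's actual implementation: the only inputs it takes from the article are the reported shares for the five bands (5/20/50/20/5 percent).

```python
# Hypothetical sketch of grading on a curve: rating bands are defined
# by percentile cutoffs, so a fixed share of teachers must land in each
# band no matter how much every classroom improves. The cutoffs match
# the shares reported for New York City (5/20/50/20/5 percent).

def rate_on_curve(percentile):
    """Map a teacher's percentile rank (0-99) to a rating band."""
    if percentile < 5:
        return "far below average"
    if percentile < 25:
        return "below average"
    if percentile < 75:
        return "average"
    if percentile < 95:
        return "above average"
    return "far above average"

# Percentile ranks are relative: even if every teacher's students
# improved, 25% of teachers still fall in the bottom two bands.
ratings = [rate_on_curve(p) for p in range(100)]
print(ratings.count("below average"))      # 20 of every 100 teachers
print(ratings.count("far below average"))  # 5 of every 100 teachers
```

Because the cutoffs are percentiles of the same cohort being rated, the bottom bands can never be empty — which is exactly the critics' point.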
New York

City's value-added initiative early entrant to evolving landscape

New York City schools erupted in controversy last week when the school district released its “value-added” teacher scores to the public after a yearlong battle with the local teachers union. The city cautioned that the scores had large margins of error, and many education leaders around the country believe that publishing teachers’ names alongside their ratings is a bad idea.

Still, a growing number of states are now using evaluation systems based on students’ standardized test scores in decisions about teacher tenure, dismissal, and compensation. So how does the city’s formula stack up to methods used elsewhere?

The Hechinger Report has spent the past 14 months reporting on teacher-effectiveness reforms around the country and has examined value-added models in several states. New York City’s formula, which was designed by researchers at the University of Wisconsin-Madison, has elements that make it more accurate than other models in some respects, but it also has elements that experts say might increase errors — a major concern for teachers whose job security is tied to their value-added ratings.

“There’s a lot of debate about what the best model is,” said Douglas Harris, an expert on value-added modeling at the University of Wisconsin-Madison who was not involved in the design of New York’s statistical formula. The city used the formula from 2007 to 2010 before discontinuing it, in part because New York State announced plans to incorporate a different formula into its teacher evaluation system.
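The basic mechanics shared by these value-added models — predict each student's score from background data, then credit the teacher with the gap between actual and predicted results — can be sketched as follows. This is a deliberately simplified illustration, not the city's formula: it uses only one predictor (prior-year score) and made-up numbers, where real models use many background characteristics.

```python
# Simplified value-added sketch (NOT the city's actual model): predict
# current scores from prior-year scores by least-squares regression,
# then score a teacher by her students' average actual-minus-predicted
# gap. All numbers below are invented for illustration.

def fit_line(xs, ys):
    """Least-squares slope and intercept for y ~ a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# District-wide data: prior-year and current-year scores.
prior  = [60, 70, 80, 90]
actual = [62, 75, 81, 94]
a, b = fit_line(prior, actual)

# One teacher's students: value-added is the mean gap between their
# actual scores and the scores the model predicted for them.
students_prior  = [70, 80]
students_actual = [78, 84]
value_added = sum(y - (a + b * x)
                  for x, y in zip(students_prior, students_actual)) / 2
print(round(value_added, 2))
```

The debates Harris describes are largely about what goes into the prediction step — which background characteristics to control for, how many years of data to use, and how to handle measurement error — not about this basic actual-versus-predicted structure.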
New York

Why it's no surprise high- and low-rated teachers are all around

The New York Times' first big story on the Teacher Data Reports released last week contained what sounded like great news: After years of studies suggesting that the strongest teachers were clustered at the most affluent schools, top-rated teachers now seemed as likely to work on the Upper East Side as in the South Bronx. Teachers with high scores on the city's rating system could be found "in the poorest corners of the Bronx, like Tremont and Soundview, and in middle-class neighborhoods," "in wealthy swaths of Manhattan, but also in immigrant enclaves," and "in similar proportions in successful and struggling schools," the Times reported.

Education analyst Michael Petrilli called the findings "jaw-dropping news" that "upends everything we thought we knew about teacher quality."

Except it's not really news at all. Value-added measurements like the ones used to generate the city's Teacher Data Reports are designed precisely to control for differences in neighborhood, student makeup, and students' past performance. The adjustments mean that teachers are effectively ranked relative to other teachers of similar students. Teachers who teach similar students, then, are guaranteed to have a full range of scores, from high to low. And, unsurprisingly, teachers in the same school or neighborhood often teach similar students.

“I chuckled when I saw the first [Times story], since the headline pretty much has to be true: Effective and ineffective teachers will be found in all types of schools, given the way these measures are constructed,” said Sean Corcoran, a New York University economist who has studied the city’s Teacher Data Reports.
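Corcoran's point — that the measures are constructed so every type of school shows the full range — can be made concrete with a toy example. The school names and gain numbers below are invented; the only assumption carried over from the article is that each teacher is scored relative to peers teaching similar students.

```python
# Toy illustration of why every school shows high- and low-rated
# teachers: once each teacher is scored relative to the average of
# peers with similar students, the scores within each group are
# deviations from that group's own mean, so they must straddle zero.
# Schools, teachers, and gains below are all made up.

gains = {
    "affluent_school":     {"t1": 12, "t2": 9, "t3": 6},
    "high_poverty_school": {"t4": 7,  "t5": 4, "t6": 1},
}

for school, teachers in gains.items():
    peer_mean = sum(teachers.values()) / len(teachers)
    relative = {t: g - peer_mean for t, g in teachers.items()}
    print(school, relative)
```

Note that the high-poverty school's raw gains are uniformly lower, yet after the adjustment both schools contain teachers above and below zero — a full range of ratings, by construction.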
New York

City releases ratings for teachers in charter, District 75 schools

The Department of Education released a final installment of Teacher Data Reports today, for teachers in charter schools and schools for the most severely disabled students. Last week, the city released the underlying data from about 53,000 reports for about 18,000 teachers who received them during the project's three-year lifespan. Teachers received the reports between 2008 and 2010 if they taught reading or math in grades 4 through 8.

When the department first announced that it would be releasing the data in response to several news organizations' Freedom of Information Law requests, it indicated that ratings for teachers in charter schools would not be made public. It reversed that decision late last week and today released "value-added" data for 217 charter school teachers. Participation in the data reports program was optional for charter schools, and some schools entered and exited the program in each year that it operated, with eight schools participating in 2007-2008 and 18 participating in 2009-2010. At the time, the city had about 100 charter schools.

The department also released reports for 50 teachers in District 75 schools, which enroll the city's most severely disabled students. The number is small because few District 75 students take regular state math and reading exams. Also, District 75 classes are typically very small, and privacy laws led the city to release data only for teachers who had more than 10 students take state tests. District 75 teachers also received reports only in 2008 and 2010; the program was optional in the district's schools in 2009.

Department officials cautioned last week that the reports had high margins of error — 35 percentage points for math teachers and 53 percentage points for reading teachers, on average — and urged caution when interpreting them.