New York

Why NAEP Matters

NYC Chancellor Joel Klein's response in Wednesday's New York Times to Diane Ravitch's op-ed last week provides a lot to chew on. Today, I'm focusing on his comments about the National Assessment of Educational Progress (NAEP), also known as the Nation's Report Card. NAEP began collecting data in 1969 and remains the only federal assessment designed to report on trends in the academic performance of U.S. children and youth. All 50 states and the District of Columbia participate in NAEP, as do New York City and a growing number of other urban school districts. NAEP has an annual operating budget of more than $130 million, a significant share of federal investment in education research. Though not an expert on testing and assessment, Diane Ravitch has a long-standing interest in NAEP: she was appointed to the bipartisan National Assessment Governing Board (NAGB), which oversees NAEP, during President Bill Clinton's second term, and remained on the board until 2004.

One of the ways NAEP differs from many other standardized tests is that it is designed to yield a much wider picture of the subject-matter knowledge the test is intended to measure. Many standardized tests are designed to provide an accurate picture of a particular child's performance. It's efficient to do so by having all test-takers respond to the same set of items. If a group of fourth-graders all answer the same 45 items on a 90-minute math exam, we can learn a lot about performance on those particular items, which are chosen to represent the content domain (such as fourth-grade math). But such a test would tell us little about student performance on other items that might have a different format or address different fourth-grade math skills. NAEP addresses this problem by using many more test items; no child answers all of them, because that would take hours and hours of testing time. Instead, each child responds to a sample of the items, and performance on these items is combined across children to yield a picture of the performance of children in general.

Testing experts such as Dan Koretz at Harvard believe that assessments like NAEP are less vulnerable to score inflation than state assessments, because it's harder to engage in inappropriate test preparation when there are so many potential items a student might see. But the tradeoff is that NAEP is not designed to provide a reliable and accurate measure of performance for a particular child. Let's look at what the Chancellor had to say about NAEP:
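The sampling design described above, in which each student answers only a subset of a large item pool and results are pooled across students, is sometimes called matrix sampling. The toy simulation below is just an illustration of that general idea, not NAEP's actual methodology (which uses far more elaborate booklet designs and statistical scaling); all the numbers in it are made up.

```python
import random

def simulate_matrix_sampling(n_students=1000, n_items=200,
                             items_per_student=45, seed=0):
    """Toy matrix-sampling simulation: no student sees every item,
    but pooling across students covers the whole item pool."""
    rng = random.Random(seed)

    # Hypothetical "true" probability of a correct answer on each item.
    true_p = [rng.uniform(0.3, 0.9) for _ in range(n_items)]

    correct = [0] * n_items   # correct responses per item
    attempts = [0] * n_items  # times each item was administered

    for _ in range(n_students):
        # Each student is given a random subset ("booklet") of items.
        booklet = rng.sample(range(n_items), items_per_student)
        for i in booklet:
            attempts[i] += 1
            if rng.random() < true_p[i]:
                correct[i] += 1

    # Per-item performance estimated from the students who saw that item.
    est_p = [c / a if a else None for c, a in zip(correct, attempts)]
    return true_p, est_p
```

With 1,000 students each answering 45 of 200 items, every item ends up being seen by a couple hundred students, so the pooled estimates track the item pool closely even though no individual student's score on the full pool exists, which is the sense in which a design like this describes groups well but individuals poorly.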

Moskowitz asks Weingarten to retract her "hypocrite" accusation

The latest in the cue-card extravaganza: Here's a letter that the former City Council member-turned-charter school operator Eva Moskowitz just sent to teachers union president Randi Weingarten, her rival.

The letter is a response to Weingarten's appearance on Fox 5's Good Day New York this morning. Weingarten told Fox 5 that City Council members commonly ask the teachers union for advice on issues. She said that Moskowitz herself asked for information when she chaired the council's education committee. "I find that people shouldn't be hypocrites," Weingarten said. "Eva used to ask us all the time when she was education chair for questions to prep the City Council about, you know, what's really going on in schools."

Moskowitz writes back today in a letter to Weingarten saying that the characterization is false, and demanding a retraction:

I never asked the UFT or any party to propose questions for me. I held over a hundred days of hearings as Chairperson of the Education Committee. I demand that you identify a single instance in which I asked the UFT for questions or used questions prepared for me by the UFT. You will be unable to find such an example because it does not exist. In light of that, please retract your inaccurate and defamatory statement.

A rivalry between Weingarten and Moskowitz burst open in 2005, when Weingarten campaigned heavily against Moskowitz's bid for borough president of Manhattan. Moskowitz had targeted labor unions in hearings when she chaired the education committee. This week, Moskowitz testified at the same hearing that drew the controversy that a "union-political complex" is holding the city back. Here's the full letter:

Two efforts to improve a school, with two different sets of tools


DOE releases SSO performance data; let the crunching begin

One thing that went under the radar during the nonstop news cycle of the last few weeks is a sizable data dump from the Department of Education, which for the first time released statistical reports about the 11 organizations that support the city's schools. The reports went online last week to inaugurate the period when schools can choose which organization they want to affiliate with.

The organizations, called School Support Organizations, or SSOs, have provided support services to individual schools for the last two years in place of the traditional school-district bureaucracy. This is the first time the DOE has allowed schools to change the affiliation they originally selected back in 2007.

The new reports include a chart (above) comparing the SSOs according to their schools' progress report scores, quality review evaluations, and principal satisfaction survey results. The result is the public evaluation that Eric Nadelstern, the DOE's chief schools officer who formerly ran the Empowerment organization, said back in January was being cooked up by the department's accountability office. The comparison, which takes into account school data from the 2007-2008 school year, shows that the SSO run by the City University of New York did the best, followed closely by the Empowerment organization.

The reports are available on the DOE's Web site only in PDF format, with a different one for each organization. A DOE spokeswoman told me that the department had not made available a database compiling the data, so I went ahead and made one, available here or after the jump. I also went one step further and added some calculations of my own, based on the DOE's data: the percent change in progress report and quality review scores from 2007 to 2008.
Among my first impressions: Schools either improved their internal operations significantly between 2007 and 2008, or else they figured out how to look like they had improved, because the percentage of schools receiving top ratings on their Quality Reviews jumped in every organization. If you have more statistics know-how than I do and some extra time on your hands (like during this school vacation), take a look and note what you see. Leave your observations in the comments.
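For anyone crunching along, the percent-change column I added is just the standard formula applied to each organization's 2007 and 2008 scores. A minimal sketch, using hypothetical scores rather than the DOE's actual figures:

```python
def percent_change(score_2007, score_2008):
    # Standard percent change: (new - old) / old * 100.
    return (score_2008 - score_2007) / score_2007 * 100

# Hypothetical SSO average scores (2007, 2008); not the DOE's real data.
sso_scores = {
    "SSO A": (60.0, 66.0),
    "SSO B": (50.0, 48.5),
}

for name, (y2007, y2008) in sso_scores.items():
    print(f"{name}: {percent_change(y2007, y2008):+.1f}%")
```

One caveat worth keeping in mind when comparing these numbers: a percent change exaggerates movement for organizations that started from a low 2007 baseline, so a small absolute gain can look like a large percentage jump.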