
New York

Even with no model middle school, city expands literacy push

Annual survey reflects sanguine views of school performance

A slide from the Department of Education's presentation of this year's Learning Environment Survey results shows teachers' responses to questions about their evaluations.

Results of the city's annual survey of what parents, students, and teachers think about their schools paint a much rosier picture than data on school performance indicate. They also offer a rosier picture of teachers' views of their evaluation system than either city or union officials have painted in the past.

This year, 94 percent of parents said they were "satisfied" with their children's education, and 95 percent of students said they have to "work hard to get good grades" — figures city officials touted as a sign that the schools are becoming more rigorous. Answering a new question, 94 percent of teachers said their school "does a good job supporting students who aspire to go to 2- or 4-year colleges."

Those responses suggest that city parents, students, and teachers remain sanguine about their schools even as the city and state have mounted a concerted effort to raise expectations. The Learning Environment Survey results, which the city published today, come on the heels of annual state test scores showing, for the second straight year, that fewer than half of the city's third through eighth graders are reading at grade level. And while the city's "college-readiness" rate has inched up since it was first announced last year, only about a quarter of students meet the city's and state's standards.

The survey results do signal that some schools are beginning to ask more of their students. Since 2009, the proportion of high school students who say they are receiving "helpful" college and career counseling has risen from 74 to 82 percent.
And while the number of students reporting sophisticated research or essay assignments barely budged, the number who said they had been asked to "complete an essay or project where [they] had to use evidence to defend [their] own opinion or ideas" three or more times increased sharply, from 62 percent in 2011 to 67 percent this year.
Into crowded field of school data comes a user-friendly report

Insideschools introduced its new school data tool, "Inside Stats," at a panel discussion on school assessment.

When Jacqueline Wayans helped her second daughter pick a high school, they were confident about their choice. After all, Wayans is a savvy parent who had worked for years visiting and reviewing schools for Insideschools, the online guide to city schools. Her older daughter had attended a city school with an arts theme and gotten a good education, and her younger daughter's top pick, Manhattan's High School for Fashion Industries, had gotten an "A" from the Department of Education.

It wasn't until after her daughter enrolled that Wayans learned Fashion Industries offered only three years of math classes. And when the school added a fourth math class, she found out too late that her daughter's scores were too low for her to qualify. Now, when Wayans's daughter starts college this fall, she'll need to take remedial math.

"I just assumed that there was a four-year sequence," Wayans said today during a panel discussion about metrics for assessing high schools that Insideschools hosted. "My older daughter had it at her high school and I just thought it was there."

Wayans isn't alone in trusting a small sliver of information to make the potentially life-changing decision about where to attend high school. Some parents and students choose schools by their names, their sports teams, or their neighborhoods, without digging deep to understand what kind of education the schools offer.

Now entering its second decade, Insideschools (where I also worked from 2005 to 2008) is preparing to launch a tool to help parents like Wayans — and those far less savvy than she is — make better choices. The tool, called "Inside Stats," is a consumer-oriented presentation of public data about high schools that is meant to complement, or perhaps even rival, the information the city distributes.
Charter sector report delayed weeks while schools verify data

DOE priorities seen in fresh tweaks to progress report formula

In an education department that's driven by data, what gets measured is a clear expression of values. So this year's elementary and middle school progress reports signal that the city is serious about integrating disabled students into regular classes, helping minority boys, and quickly getting immigrant students learning in English.

The broad contours of what we'll see later today, when the Department of Education releases the newest progress reports, based on the last school year, have been clear for months. Back in the spring, the DOE told principals that it would not insulate schools against steep score drops as it did last year, so we know that more schools will get failing grades that put them at risk of closure. In fact, the department set a fixed distribution of scores: 25 percent of schools will get As, 35 percent Bs, 30 percent Cs, 7 percent Ds, and 3 percent Fs. Last year, just 5 percent of schools were awarded D or F grades.

We also know each school's state test scores, announced last month. While high or low average scores don't always equate to high or low progress report grades, they often do, because the reports are based mostly on the test scores. (The department is also guaranteeing that schools with test scores in the top third citywide get no lower than a C; last year, only schools in the top quarter got that promise.) Also, because fewer schools registered large test score gains or losses this year, progress report grades are likely to be relatively stable.

That means that the biggest changes could come as the result of the department's annual tinkering with the reports' formula.
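A fixed distribution like that is easy to sketch in code. The following is a hypothetical illustration, not the DOE's actual formula: it simply ranks schools by an overall score and hands out letter grades in the announced proportions (25 percent As, 35 percent Bs, 30 percent Cs, 7 percent Ds, 3 percent Fs); the school names and scores are invented.

```python
# Hypothetical sketch of a fixed grade distribution: rank schools by
# overall score, then assign letters in the announced proportions.
# School names and scores are made up; this is not the DOE's formula.

def assign_grades(scores,
                  distribution=((25, "A"), (35, "B"), (30, "C"),
                                (7, "D"), (3, "F"))):
    """Map {school: score} to {school: letter} so that each letter
    covers its fixed share of the ranked list, best scores first."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    grades = {}
    cumulative, start = 0, 0
    for pct, letter in distribution:
        cumulative += pct
        end = round(len(ranked) * cumulative / 100)
        for school in ranked[start:end]:
            grades[school] = letter
        start = end
    return grades

# With 100 schools, exactly 25 land an A and 3 land an F.
demo = {f"School {i:03d}": 100 - i for i in range(100)}
print(sum(1 for g in assign_grades(demo).values() if g == "A"))  # 25
```

One consequence the sketch makes concrete: under a fixed distribution, a school's grade depends on where it ranks against other schools, not on whether its own score rose or fell.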
A stab at a cleaner, more user-friendly look at city test score data

When state and city education officials released the 2010-2011 ELA and math test data on Monday, they didn't make it easy for interested New Yorkers to make sense of the scores. One spreadsheet, released by the city Department of Education, left off school names and identified results only by school code. It also excluded public charter schools entirely. The state's spreadsheet included names, but listed every other public school in New York State as well.

There was also no easy way to compare schools to one another. The city included a comparison against previous years' scores, but the file didn't allow users to compare change over time among schools. The state's data didn't include any previous scores at all. Not surprisingly, many of our readers emailed us to express their frustration over the scattered and unwieldy data.

When I asked DOE spokesman Matthew Mittenthal about it, he told me that grouping the data into school-by-school comparisons wasn't a priority when publishing the information. "We would never use test scores alone for accountability purposes, so we don't actively encourage people to compare one school to another on that basis," Mittenthal wrote in an email.

We spent the past couple of days reworking the spreadsheets to make the data easier and more intuitive to navigate. First, we matched the codes used by the DOE to actual school names (for example, 15K447 = The Math & Science Exploratory School). Then, we stripped non-essential data and added last year's test results in a new column. Finally, we sorted the schools by performance so the best-scoring are at the top.
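Those three steps amount to a small join-and-sort job. Here's a minimal sketch using plain Python dictionaries in place of the real spreadsheets; every code, name, and score below is invented for illustration, except 15K447, which the article itself maps to The Math & Science Exploratory School.

```python
# Sketch of the three cleanup steps, with invented sample data standing
# in for the actual DOE and state spreadsheets.

# Step 1: match DOE school codes to actual school names.
code_to_name = {
    "15K447": "The Math & Science Exploratory School",
    "01M015": "P.S. 015 Roberto Clemente",  # hypothetical entry
}

scores_2011 = {"15K447": 71.2, "01M015": 44.5}  # invented percentages
scores_2010 = {"15K447": 68.0, "01M015": 47.3}

# Step 2: keep only the essentials and attach last year's result
# as an extra column.
rows = [
    {
        "school": code_to_name[code],
        "2011": pct,
        "2010": scores_2010.get(code),
    }
    for code, pct in scores_2011.items()
]

# Step 3: sort so the best-scoring schools come first.
rows.sort(key=lambda r: r["2011"], reverse=True)
for r in rows:
    print(f'{r["school"]}: {r["2011"]} (was {r["2010"]})')
```

The same pattern scales to the full spreadsheets: build the code-to-name lookup once, merge on the school code, and sort on the current-year column.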
Principals are optimistic about ARIS, but kinks continue

Nearly two-thirds of principals say the Department of Education's $81 million online data warehouse could help improve teaching and learning at their schools. The finding is among the results of a survey conducted by Public Advocate Betsy Gotbaum's office, which released a statement today emphasizing that more than a third of principals did not think the system was helping their schools. In its coverage of Gotbaum's report, the New York Times billed the system as being "supported by most principals," and the city has said that its internal survey results show that most principals see benefits to the system.

ARIS's solid approval rating doesn't mean all of its kinks have been worked out. The Manhattan School for Children's parent coordinator sent the following e-mail to parents last week:

ARIS and Classroom Assignments

It has come to my attention that the classroom teacher assignments have been posted on ARIS, and I have been trying to unravel the mystery as to how these assignments came to be posted. I have also discovered that there are many mistakes. The official letters from MSC will be sent at the end of August. I am also out of town and cannot access the ID numbers that many parents are now requesting. Please double check the letters that you received from your classroom teacher. Both numbers were given out at the same time. Again, you will be notified about your official class by mail. Please do not rely on the ARIS site for this information.

The parent who forwarded me the e-mail said the incorrect information has been removed from the system but new information hasn't yet been uploaded. (The system opened to parents in May.)
DOE releases SSO performance data; let the crunching begin

One thing that went under the radar during the nonstop news cycle of the last few weeks is a sizable data dump from the Department of Education, which for the first time released statistical reports about the 11 organizations that support the city's schools. The reports went online last week to inaugurate the period when schools can choose which organization they want to affiliate with.

The organizations, called School Support Organizations, or SSOs, have provided support services to individual schools for the last two years in place of the traditional school-district bureaucracy. This is the first time that the DOE has allowed schools to change the affiliation they originally selected back in 2007.

The new reports include a chart (above) comparing the SSOs according to their schools' progress report scores, quality review evaluations, and principal satisfaction survey results. The result is the public evaluation that Eric Nadelstern, the DOE's chief schools officer who formerly ran the Empowerment organization, said back in January was being cooked up by the department's accountability office. The comparison, which takes into account school data from the 2007-2008 school year, shows that the SSO run by the City University of New York did the best, followed closely by the Empowerment organization.

The reports are available on the DOE's Web site only in PDF format, and there is a different one for each organization. A DOE spokeswoman told me that the department had not made available a database compiling the data, so I went ahead and made one, available here or after the jump. I also went one step further and added some calculations of my own, based on the DOE's data: the percent change in progress report and quality review scores from 2007 to 2008.
Among my first impressions: Schools either improved their internal operations significantly between 2007 and 2008, or else they figured out how to look like they had improved, because the percentage of schools receiving top ratings on their Quality Reviews jumped in every organization. If you have more statistics know-how than I do and some extra time on your hands (like during this school vacation), take a look and note what you see. Leave your observations in the comments.
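The percent-change column added to the compiled data is simple arithmetic, but it's worth being explicit about the formula. A sketch, with invented SSO scores rather than the actual DOE figures:

```python
# Sketch of the percent-change calculation added to the compiled SSO
# data. The scores below are invented, not the DOE's actual figures.

def pct_change(old, new):
    """Percent change from old to new; e.g. 50 -> 60 is +20.0."""
    return (new - old) / old * 100

# (2007 score, 2008 score) per SSO -- illustrative values only.
sso_scores = {
    "CUNY": (58.0, 64.4),
    "Empowerment": (60.0, 63.0),
}

for sso, (y2007, y2008) in sorted(sso_scores.items()):
    print(f"{sso}: {pct_change(y2007, y2008):+.1f}%")
```

Note that percent change is relative to the starting score, so the same point gain looks larger for an organization that started lower.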
A tour of schools data around the country – California (LA), Denver, Houston

In reflecting on transparency in government, I thought I'd take a look around the country at a few other urban school districts to see how they make data available to the public. Are there school districts out there that are models for all in terms of making data accessible? Today: LA, Denver, and Houston. Tomorrow: DC, Chicago, and Baltimore. If there are other cities you think I should look at, leave a comment. Next week, we'll see what users in each of these cities have to say about the availability of data; if you're from one of the featured cities and can provide perspective, please email me. Also, what tools would be most helpful to you as someone interested in education?

In exploring each site, I looked to see what information is available, in what format, how quickly I found it, and whether special tools were available to help me navigate the data and answer my own questions. Please keep in mind that since I'm not from these other cities, I'm a "naive user" of these sites, perhaps similar to a parent or community member interested in but not expert at finding what's out there. If I've missed anything on any of the sites I visited, let me know so I can update this.

Screenshot of California's STAR system

Starting out west, I spent a few minutes at the LA Unified School District homepage, which relatively quickly led me to the California Department of Education's Standardized Testing and Reporting (STAR) system, a tool that allows you to search at different levels (county, district, school) and by subgroup, and to view or download tables of information. Both mean scale scores and the percentage of students at each proficiency level are reported.
What's problematic is that to compare subgroups or years, you have to create separate reports for each category you want to compare (e.g., first request 2006 data, then request 2007 data, then compare on your own); the tool would be immensely more powerful if it allowed you to select two or more subgroups or years for comparison. Summary tables comparing different subgroups and different years are available with the 2007 press release, but only for some kinds of data (proficiency statistics are compared but not scale scores, for example).