First Person

The Flat Earth Society

Today’s New York Daily News published a bold editorial on the progress of New York City schoolchildren under the administration of Mayor Mike Bloomberg and Chancellor Joel Klein.  “You would be better off arguing that the world is flat, or that the sun revolves around the Earth, than to dispute that New York City kids are performing better and better in school,” writes the Daily News, crowing that there are “fresh and incontrovertible data” pointing to what the newspaper refers to as a “sea change” in New York City. 

They might have wanted to wait a day.

This morning, the U.S. Department of Education released the 2009 results of the National Assessment of Educational Progress assessments of fourth-grade and eighth-grade mathematics in each state and for the nation overall.  Nationally, fourth-grade performance held steady from 2007 to 2009, and there was a slight but statistically significant gain over this period in eighth-grade math performance.  In New York State, the small declines in fourth grade and gains in eighth grade were not statistically significant, leading to the conclusion that there has been no change in the performance of New York students on the NAEP math assessment from 2007 to 2009. 

This is a very different story from the one told by New York’s own assessment system, on which the Bloomberg and Klein administration has staked its claims about the great progress in student achievement.  The average scale score in fourth-grade mathematics increased from 680 in 2007 to 689 in 2009, a hefty 9 points;  the jump in eighth-grade scores was even more dramatic, as the average scale score rose from 657 in 2007 to 675 in 2009, a remarkable increase of 18 points.

To put these two sets of numbers in context, the chart below shows the gains in fourth-grade and eighth-grade math performance from 2007 to 2009 expressed in standard deviation units (i.e., the amount of variation among individual students in 2007).  According to NAEP, fourth-graders’ performance fell .07 standard deviations from 2007 to 2009, a difference that is not significantly different from zero.  In contrast, fourth-graders gained .23 standard deviations on the New York State assessment from 2007 to 2009.  Similarly, the NAEP results indicate that eighth-graders in New York gained .08 standard deviations from 2007 to 2009 in math performance, a difference that is not significantly different from zero, but they gained .47 standard deviations over this period on the New York State test.

[Chart: gains in fourth-grade and eighth-grade math performance from 2007 to 2009, in standard deviation units, NAEP vs. New York State tests]

Another way of comparing the implications of the two different sets of test results is to think about where the average student in 2009 would have scored in 2007.  Based on these standard deviations, and assuming that the scores follow a bell-curve distribution, the New York State scores indicate that the average fourth-grader in 2009 scored at the 59th percentile of the 2007 fourth-grade distribution, which is a pretty big jump.  The increment for eighth-graders is even more striking:  the average eighth-grader in 2009 scored at the 68th percentile of the 2007 eighth-grade distribution, based on the New York State tests.  In contrast, the NAEP data indicate that the average New York fourth-grader in 2009 scored at the 47th percentile of the 2007 distribution of fourth-grade math performance in New York State, and the average eighth-grader in 2009 scored at the 53rd percentile of the 2007 eighth-grade distribution.
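The percentile figures above follow directly from the standard-deviation gains under the bell-curve assumption: the percentile is just the standard normal cumulative distribution function evaluated at the gain. A minimal sketch of that arithmetic, using the gains reported above:

```python
# Convert a gain expressed in standard deviation units into the percentile
# of the prior year's distribution, assuming scores are approximately
# normally distributed (the bell-curve assumption stated in the text).
from math import erf, sqrt

def normal_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Gains from 2007 to 2009 in standard deviation units, as reported above.
gains = {
    "NAEP, grade 4": -0.07,
    "NAEP, grade 8": 0.08,
    "NY State test, grade 4": 0.23,
    "NY State test, grade 8": 0.47,
}

for label, sd_gain in gains.items():
    pct = round(100 * normal_cdf(sd_gain))
    print(f"{label}: 2009 average at percentile {pct} of the 2007 distribution")
```

Running this reproduces the figures in the paragraph above: percentiles 47 and 53 for NAEP, versus 59 and 68 for the New York State tests.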
 
How can we explain these differences?  There are lots of possible explanations, but most of them don’t hold up under close scrutiny.  The two tests are taken by similar populations of students under similar conditions, and the grade-level mathematics standards on which the two assessments are based do not differ dramatically.  The NAEP test is a low-stakes test, which might result in students not taking it seriously, but the statisticians who oversee the NAEP testing program look for patterns suggesting this, and find little evidence of it.  It’s extremely unlikely that there’s rampant cheating going on in the New York State testing system that could explain the differences. 

It’s possible that the New York State tests have been getting easier over time; I have yet to see definitive evidence ruling this out.  There is also strong suggestive evidence of “score inflation” in the New York State tests, because there are predictable patterns in the standards that appear on the state tests year after year, with some standards showing up repeatedly and others never tested at all during the life of the testing program.  Schools and teachers can make use of these patterns, which also show up in the format of test questions covering particular standards, to focus their instruction on the subset of standards that crop up again and again.  Because the New York State tests never test some standards, we have no idea whether students have mastered them.  In contrast, the design of the NAEP assessment allows for a much broader picture of mathematics performance, because so many more standards and test item formats are incorporated into the test.

Whatever the reason, the discrepancy between the NAEP trends and the trends in the New York State test scores raises serious questions about what the New York tests are telling us about the academic performance of students in New York State.  The same, of course, goes for New York City.  We’ll see NAEP scores for New York City in a month or so, but it’s unlikely that they will yield a different story than what I’m describing here.
 
Is the Earth flat?  No.  But New York State test scores, and probably New York City scores, are.

First Person

What we’ve learned from leading schools in Denver’s Luminary network — and how we’ve used our financial freedom

PHOTO: Nicholas Garcia
Cole Arts and Science Academy Principal Jennifer Jackson sits with students at a school meeting in November 2015.

First Person is a standing feature where guest contributors write about pressing issues in public education. Want to contribute? More details here.

Three years ago, we were among a group of Denver principals who began meeting to tackle an important question: How could we use Colorado’s innovation schools law to take our schools to the next level?

As leaders of innovation schools, we already had the ability to make our own choices around the curriculum, length of school day, and staffing at our campuses. But some of us concluded that by joining forces as an independent network, we could do even more. From those early meetings, the Luminary Learning Network, Denver’s first school innovation zone, was born.

Now, our day-to-day operations are managed by an independent nonprofit, but we’re still ultimately answerable to Denver Public Schools and its board. This arrangement allows us to operate with many of the freedoms of charter schools while remaining within the DPS fold.

Our four-school network is now in its second year under this new structure. Already, we have learned some valuable lessons.

One is that having more control over our school budget dollars is a powerful way to target our greatest needs. At Cole Arts & Science Academy, we recognized that we could serve our scholars more effectively and thoughtfully if we had more tools for dealing with children experiencing trauma. The budget flexibility provided by the Luminary Learning Network meant we were able to provide staff members with more than 40 hours of specially targeted professional development.

In post-training surveys, 98 percent of our staff members reported the training was effective, and many said it has helped them better manage behavioral issues in the classroom. Since the training, the number of student behavior incidents leading to office referrals has decreased from 545 incidents in 2016 to 54 in 2017.

At Denver Green School, we’ve hired a full-time school psychologist to help meet our students’ social-emotional learning goals. She has proved to be an invaluable resource for our school – a piece we were missing without even realizing how important it could be. With a full-time person on board, we have been able to take proactive steps like group and individual counseling, none of which we could do before with only a part-time social worker or school psychologist.

Both of us have also found that having our own executive coaches has helped us grow as school leaders. Having a coach who knows you and your school well allows you to be more open, honest, and vulnerable. This leads to greater professional growth and more effective leadership.

Another lesson: scale matters. As a network, we have developed our own school review process – non-punitive site visits where each school community receives honest, targeted feedback from a team of respected peers. Our teachers participate in a single cross-school teacher council to share common challenges and explore solutions. And because we’re a network of just four schools, both the teacher council and the school reviews are small-scale, educator-driven, and uniquely useful to our schools and our students. (We discuss this more in a recently published case study.)

Finally, the ability to opt out of some district services has freed us from many meetings that used to take us out of our buildings frequently. Having more time to visit classrooms and walk the halls helps us keep our fingers on the pulse of our schools, to support teachers, and to increase student achievement.

We’ve also had to make trade-offs. As part of the district, we still pay for some things (like sports programs) our specific schools don’t use. And since we’re building a new structure, it’s not always clear how all of the pieces fit together best.

But 18 months into the Luminary Learning Network experiment, we are convinced we have devised a strategy that can make a real difference for students, educators, and school leaders.

Watch our results. We are confident that over the next couple of years, they will prove our case.

Jennifer Jackson is the principal of Cole Arts & Science Academy, which serves students from early childhood to grade five with a focus on the arts, science, and literacy. Frank Coyne is a lead partner at Denver Green School, which serves students from early childhood to grade eight with a focus on sustainability.

First Person

Let’s be careful with using ‘grading floors.’ They may lead to lifelong ceilings for our students

PHOTO: Helen H. Richardson, The Denver Post

I am not a teacher. I am not a principal. I am not a school board member. I am not a district administrator (anymore).

What I am is a mother of two, a high-schooler and middle-schooler. I expect them both to do their “personal best” across the board: chores, projects, personal relationships, and yes, school.

That does not mean all As or Bs. We recognize the sometimes arbitrary nature of grades. (For example, what is “class participation” — is it how much you talk, even when your comments are off topic?) We have made it very clear that as long as they do their “personal best,” we are proud.

That doesn’t mean, though, that when someone’s personal best results in a poor grade, we should look away. We have to ask what that grade tells us. Often, it’s something important.

I believe grading floors — the practice (for now, banned in Memphis) of deciding the lowest possible grade to give a student — are a short-sighted solution to a larger issue. If we use grade floors without acknowledging why we feel compelled to do so, we perpetuate the very problem we seek to address.

In a recent piece, Marlena Little, an obviously dedicated teacher, cites Superintendent Hopson’s primary drive for grade floors as a desire to avoid “creat[ing] kids who don’t have hope.” I am not without empathy for the toll failing a course may take on a student. But this sentiment focuses only on the social-emotional learning aspect of our students’ education.

Learning a subject builds knowledge. Obtaining an unearned grade only provides a misleading indication of a child’s growth.

This matters because our students depend on us to ensure they will be prepared for opportunities after high school. To do this, our students must possess, at the very least, a foundation in reading, writing and arithmetic. If we mask real academic issues with grade floors year after year, we risk missing a chance to hold everyone — community, parents, the school board, district administration, school leaders, teachers, and students — accountable for rectifying the issue. It also may mean our students will be unable to find employment providing living wages, resulting in the perpetuation of generational poverty.

An accurate grade helps the teacher, parents, and district appropriately respond to the needs of the student. And true compassion lies in how we respond to a student’s F. It should act as an alarm, triggering access to additional work, other intervention from the teacher or school, or the use of a grade recovery program.

Ms. Little also illustrates how important it is to have a shared understanding of what grades should mean. Consider the fifth-grade boy she refers to, who demonstrates mastery of a subject orally but has trouble demonstrating it in a written format. Why should he earn a zero (or near-zero) in the class? If we agree that grades should indicate how well a student knows the subject at hand, I would argue that that fifth-grade boy should earn a passing grade. He knows the work! We don’t need grade floors in that case — we need different ideas about grades themselves.

We should also reconsider the idea that an F is an F. It is not. A zero indicates that the student did not understand any of the work or the student did not do any of the work. A 50 percent could indicate that the student understood the information half the time. That is a distinction with a difference.

Where should we go from here? I have a few ideas, and welcome more:

  1. In the short term, utilize the grade recovery rules that allow a student to use the nine weeks after receiving a failing grade to demonstrate their mastery of a subject — or “personal best” — through monitored and documented additional work.
  2. In the intermediate term, create or allow teachers to create alternative assessments like those used with students with disabilities to accommodate different ways of demonstrating mastery of a subject.
  3. In the long term, in the absence of additional money for the district, redeploy resources in a coordinated and strategic way to help families and teachers support student learning. Invest in the development of a rich, substantive core curriculum and give teachers the training and collaboration time they need.

I, like Ms. Little, do not have all the answers. This is work that requires our collective brilliance and commitment for the sake of our children.

Natalie McKinney is the executive director of Whole Child Strategies, Inc., a Memphis-based nonprofit that provides funding and support for community-driven solutions for addressing attendance and discipline issues that hinder academic success. She previously served as the director of policy for both Shelby County Schools and legacy Memphis City Schools.