Even as policymakers are putting more emphasis on test score growth, it’s becoming less important to principals, according to a new study.

A team of researchers from Vanderbilt’s Peabody College of Education and Human Development looked at the key factors used by principals to make decisions about teacher hiring, contract renewal, classroom assignments and professional development in six urban school districts in five states. Two districts were in Tennessee: Shelby County Schools and Metropolitan Nashville Public Schools.

They found that, as teacher observation rubrics become increasingly detailed, principals rely on that qualitative data far more than value-added measures.

Value-added measures are used to estimate how much teachers contribute to students’ growth on statewide assessments. By law in Tennessee, value-added scores (TVAAS) comprise up to 35 percent of teachers’ evaluation scores. They can be used in decisions about firing a teacher. School-wide, TVAAS scores are used to make decisions about which schools are kept open.

Day to day, however, principals rarely consider them, according to the study, which was released last week. Only 18 percent of principals reported regularly reviewing teachers’ value-added scores.

One likely reason principals put more stock in what they observe than in students’ test score growth is that value-added is complicated, said Jason Grissom, a lead researcher. It’s complex to the point that few people fully understand value-added, even within the education research community, he acknowledged.

“The algorithms are complicated, the statistical adjustments are messy. . . . People tend to not put a lot of stock into what they don’t understand,” he said. “Principals will say, ‘I don’t know that (value-added) is necessarily a valid measure of what a teacher’s kids were last year, because I don’t know exactly what value-added equation is spitting out. But I know what I saw; I know what I observed.’”

Eighty-four percent of the 100 principals interviewed found observation data to be “valid to a large extent,” while only 56 percent said the same of value-added.

The timing of the release of value-added scores also makes them of little use to principals, Grissom said. Principals learn teachers’ value-added scores in the late fall, long after they’ve assigned teachers to classrooms and hired new ones.

Finally, principals find the specificity of observations more useful, Grissom said. The rubrics used in the states that participated in the study, including Tennessee, cover everything from planning to overseeing student group work, so principals have a detailed knowledge of teachers’ strengths.

“The level of specificity in observation is a big reason why they’re being used so much,” Grissom said. “Value-added is just a number.”

Tennessee educators: Does this study ring true with your experience? We welcome your comments below.

Contact Grace Tatter at gtatter@chalkbeat.org
