To our readers

Hey, we heard you. You had a lot of questions about TNReady. We found answers.

The news that Tennessee’s testing company scored some high school tests incorrectly this year uncorked a flood of questions about the validity of the state’s new standardized assessment.


We wanted to know how the brouhaha was impacting classrooms, so we asked our readers on Facebook.

You responded in droves.

We took your top concerns directly to the state Department of Education and asked for answers. Here’s what you wanted to know — and what we have learned:

Several readers asked why they should trust TNReady results, given the series of setbacks in the test’s first two years.

  • “I do not trust the results. We have had so many problems in the last few years, that I am suspicious of any results we do get. It bothers me greatly that the state uses these numbers to hold students and teachers and districts accountable, but they seem to be unable to deliver scores they believe are accurate in a timely manner.” —Rebecca Dickenson
  • “I no longer trust the accountability of the state nor its methods. My concern is if there is a teacher who has only one year of test data, how is it the same teacher shown multi-year growth when he or she had only last year of testing? This poses a huge concern.” —Mildred Williams  

Tennessee Department of Education: “TNReady is fully aligned to Tennessee’s academic standards, and every question has been reviewed, edited, and approved by Tennessee teachers through a rigorous review process. We also have quantitative checks and processes after a test is over to ensure student responses are reliable. While more than 99.9% of TNReady tests were scored accurately this year, we want to improve on that next year, and our vendor (Questar) is taking new quality assurance steps to make sure their programming is error-free. Also, this year, as soon as the scoring error on some of the English I, II and Integrated Math II EOCs was identified, scores were updated and all TNReady tests were re-reviewed and verified for full accuracy.”

Some teachers told us that, after this spring's delayed score deliveries, many students doubt the results will arrive in time to affect their final grades next spring. Those teachers are struggling to get their students to buy in.

  • “After two years of TNReady, it still hasn’t counted for my students. Going into year three, I will once again tell them with a hopeful, straight face that it will count as part of their report card grades and implore them to try their best. I quietly wonder what reason they have to believe me, given recent history.” —Mike Stein
  • “I struggle to get students to buy in to the importance of trying their best on state tests because the students are confident that the scores won’t come back in time to affect their grades (which has been the situation for several years now). The students see zero incentive for doing well.” —Nicole Mayfield

TDOE: “We believe that if districts and schools set the tone that performing your best on TNReady is important, then students will take the test seriously, regardless of whether TNReady factors into their grade. We should be able to expect our students will try and do their best at any academic exercise, whether or not it is graded. This is a value that is established through local communication from educators and leaders, and it will always be key to our test administration. We believe that when we share these messages and values celebrating the variety of accomplishments our students have made, taking advantage of TNReady’s scheduling flexibility to minimize disruption, focusing on strong standards-based instruction every day, sending positive messages around the importance of the variety of tests that students take, and sharing that students should always do their best then students will buy-in and TNReady will be successful.”

Other teachers asked what happens to writing scores for tests in English language arts.

  • “I can tell you that two years ago — when we first piloted the new writing test online — districts received not only every student’s scores (broken down by each of the four indicators) but also the actual student responses to each prompt. In my former district our supervisor shared them, and we analyzed them as a department. If you check with your principal, VP, or supervisors, there are some published “anchor papers” with scores available on edtools from this past year. It’s not a lot, but it’s more than we’ve had in the past. My hope is that if online continues, we’ll keep seeing the student responses in the future.” —Wj Gillespie II

TDOE: “The question appears to be referencing the process we had through the 2014-15 school year, when our writing assessment was separate. Since 2015-16, students’ writing responses on TNReady have been incorporated as part of their overall ELA score. Responses are scored based on our writing rubrics, and for educators, we have provided access to the “anchor papers” from the 2016-17 year, so they can see how students’ responses were scored based on the writing rubric, which can help them inform the feedback they give their students.”

On that same issue of writing scores, one teacher referenced the hiring of scorers off of Craigslist. We asked the state if that’s true.

  • “I continue to be curious about our ELA writing scores. Each year we are required to use state writing rubrics, attend PD related to the state’s four types of writing, etc etc…and yet our scores never come back. Students spend hours taking the writing portion of the test, scorers are hired off Craig’s list…, and yet we never actually get the scores back. It seems like every year this is swept under the rug. Where do these writing tests go?” —Elizabeth Faison Clifton

TDOE: “Questar does not use Craigslist. Several years ago, another assessment company supposedly posted advertisements on Craigslist, but Questar does not. We provide opportunities for our educators to be involved in developing our test, and we also encourage Tennessee teachers to apply to hand-score TNReady. To be eligible, each applicant must provide proof of a four-year college degree, and preference is given to classroom teachers. As part of the interview process, an applicant would have to hand-score several items for review and evaluation. Once hired, each scorer is trained based on materials that Tennessee teachers and the department approve — and which are assembled from responses given by Tennessee students on the exam — and scorers are regularly refreshed and “recalibrated” on scoring guidelines. Each writing response is scored at least twice; if those two responses differ significantly, they are sent to a third scorer. Each day, the department reads behind a sample of essays to ensure hand-scorers are adhering to the criteria set by our teachers. Any scores that do not align are thrown out, and those scorers are retrained. Any scorer who does not meet our and Questar’s standards is released from scoring TNReady.”

Finally, readers expressed a lot of concern about the complexity behind growth scores known as TVAAS, which are based on TNReady results and which go into teachers’ evaluations. We asked the state for a simple explanation.

  • “What formula is used in calculating the overall score for TVAAS when fallacies were determined as a result? My performance is weighed heavily on the state TVAAS score which is why this type of error has occurred before. This is quite disturbing. Teachers work tirelessly to ensure student achievement is a success; however, testing to measure performance seems to not be working.” —Mildred Williams  
  • “No one can give me the formula for how my students’ scores are calculated to create my score in TVAAS. How is (t)hat transparency? Yet, I’m required, constantly, to “prove” myself with documentation of education, observations, professional development and the like; all in originals, of course, to numerous overseeing bodies.” —Rachel Bernstein Kannady
  • “I find it ludicrous that data from these tests are used to evaluate MY performance when I get little to no control over most of the variables regarding the test. How could a miscalculated, misinformed, and (for all I know) incomprehensible test demonstrate what my students have learned!? And don’t even get me started on that fact that the rigor of the tests was increased ten-fold, yet not scaffolded in.” —Nicole Mayfield

TDOE: “TVAAS is statistically valid and reliable, and we follow the recommendations outlined by the American Educational Research Association (AERA) on value-added measures. Conceptually, TVAAS looks at how students have performed historically on TCAP and TNReady and compares their performance to their peers who have had similar past performance. If students tended to grow at about the same rate as their peers across the state — the expected amount of growth — they would earn a 3. If students tended to grow faster than their peers, they would earn a 4 or a 5, depending on the amount of progress they showed. If they tended to not show as much growth as their peers, they would earn a 1 or a 2. The model itself is sophisticated and complex to be as fair and nuanced as possible for each teacher’s situation, and we are working with our educator preparation providers as well as district leaders to provide more training on specifically how the model calculates scores. Tennessee educators also have access to a TVAAS user support team that can answer any specific questions about their TVAAS data, including how the data was analyzed.

Because TVAAS always looks at relative growth from year to year, not absolute test scores, it can be stable through transitions — and that is what we saw this year. Students can still grow, even if their overall proficiency level is now different. You can think about it like a running race. If you used to finish a 5K at about the same time as 10 other students, and all 10 students made the same shift to a new race at the same time with the same amount of time to prepare, you should finish the new race at about the same time. If you finished ahead of the group’s average time, you grew faster than your peers. If you lagged behind everyone, that would indicate you did not grow as much as was expected.  Because students’ performance will be compared to the performance of their peers and because their peers are making the transition at the same time, drops in statewide proficiency rates resulting from increased rigor of the new assessments had no impact on the ability of teachers, schools, and districts to earn strong TVAAS scores. Transitions to higher standards and expectations do not change the fact that we still want all students in a district to make a full year’s worth of growth, relative to their peers who are all experiencing the same transition.”

Reporter Laura Faith Kebede contributed to this report.

missed opportunities

A new report argues that students are suffering through bad teaching and simplistic classwork. Is that true?


America’s public school classrooms are full of students who aren’t being challenged.

That’s the claim of a new report by TNTP, the nonprofit advocacy and consulting group, looking at student work and real-life teaching. Students are “planning their futures on the belief that doing well in school creates opportunities — that showing up, doing the work, and meeting their teachers’ expectations will prepare them for what’s next,” it says. “Unfortunately, it’s a myth.”

The study, called The Opportunity Myth, relies on TNTP’s exhaustive effort to get at what students are really doing in class by surveying them in real time, reviewing their work, and observing class instruction — a combination rarely seen in education research. Based on this data, the report argues that low-income students of color in particular are suffering through mediocre instruction and simplistic classwork while their teachers expect little of them.

“Students spent more than 500 hours per school year on assignments that weren’t appropriate for their grade and with instruction that didn’t ask enough of them,” the report says.

It’s not clear that the study’s methods can support such strong conclusions, though. TNTP’s claims turn on its own subjective way of rating instruction and assignments, and it’s unclear if different approaches would yield different results. And the paper examined just four districts and one charter school network, all anonymous.

That means the study is at once extensive and limited: extensive because it amounts to a massive undertaking to better understand students’ experience, but limited because it only examines a fraction of students in a fraction of classrooms in a handful of districts, none of which were chosen randomly.

Regardless of debates about the methods, the report may draw significant attention. The research of TNTP, previously known as The New Teacher Project, has a track record of shaping policy, particularly with an influential 2009 report known as The Widget Effect, which focused on perceived flaws of teacher evaluation systems.

The latest study was funded by the Joyce Foundation, the Chan Zuckerberg Initiative, the Nellie Mae Education Foundation, the Walton Family Foundation, the Overdeck Family Foundation, and the Barr Foundation. (Chan Zuckerberg, Joyce, Overdeck, and Walton are also funders of Chalkbeat.)

In contrast with some of the group’s past work, the latest report concludes with few controversial policy recommendations, instead calling for higher expectations and a careful examination of disparities in school resources.

“We believe it’s time to move beyond important but narrow debates — from how to measure teacher performance to charter versus district to the role of standardized testing — and return to the basic guiding principle that brings us to this work: the right of every student to learn what they need to reach their goals,” the report says.

TNTP focuses on three large urban districts, one small rural district, and one charter network with three schools in separate cities. In all but the rural district, a majority of students are black or Latino, as well as low-income.

From there, TNTP got a handful of teachers from certain schools in each district to document their students’ work, collecting and photographing the assignments done by six students during three separate weeks. (Students had to receive parental consent and their names were removed from the work.) TNTP then assigned a rating to each significant piece of work, looking at whether it was on grade level, among other traits.

TNTP also had observers watch and then rate two lessons by each teacher using the group’s rubrics and surveyed teachers to determine their views on whether students could meet their state’s academic standards.

Finally, they surveyed students on their classroom experiences. TNTP used a novel approach for tracking student feelings, asking students whether they were bored or felt excited about learning at various points in a lesson.

In all, TNTP says, it reviewed nearly 1,000 lessons, 20,000 examples of student work, and 30,000 real-time student surveys. And the results, the report said, are grim: only 16 percent of lessons observed had “strong instruction,” and about a quarter of assignments were deemed “grade appropriate.”

This varied from district to district and classroom to classroom. In the most specific example provided in the report, one eighth-grade assignment asked a student to fill in the missing vowels from the word “habitat” after reading a short passage; in contrast, another required students to write a lengthy essay based on a memoir by one of the students who desegregated the all-white high school in Little Rock, Arkansas.

Student surveys were somewhat more positive: a narrow majority of students, about 55 percent, were generally engaged and interested during class, based on TNTP’s survey.

Students of color and low-income students tended to be in classes with worse instruction, fewer grade-level assignments, and lower expectations for meeting standards. That was correlated with slightly lower rates of test score growth.

All of that, TNTP concludes, amounts to a damning case against most of the classrooms in question and American schools in general. “Students spend most of their time in school without access to four key resources: grade-appropriate assignments, strong instruction, deep engagement, and teachers who hold high expectations,” the report says.

“The ‘achievement gap,’ then, isn’t inevitable. It’s baked into the system, resulting from the decisions adults make.”

But TNTP faces steep challenges in using its data to make such strong claims.

In addition to the districts, schools, teachers, and students not being chosen randomly, the report is not able to pin down whether those resources lead to higher achievement or definitively show why some students seem to have less access to the key resources it cites.

One of the report’s central claims, that increasing access to those resources will boost students’ academic performance, rests on relatively small correlations. In fact, the study showed little if any overall relationship between teachers’ observation scores and their effects on test scores.

“We need to be a little careful about asserting that by increasing one or more of the four resources we will necessarily improve outcomes for kids,” said Jim Wyckoff, a professor at the University of Virginia who sat on an advisory panel for the report, while also noting that he thought the basic theory of the report made sense.

The report’s appendix notes that “classrooms with initially higher performing students tended to get better assignments, better instruction, were more engaged, and had teachers with significantly higher expectations.” But other research has shown that observers tend to give unfairly high ratings to classrooms with more high-achieving students, meaning cause and effect could run the other way here.

TNTP’s measure of teacher expectations relies on teachers’ responses to statements like “My students need something different than what is outlined in the standards,” something that may be conflating high expectations with teachers’ views about the quality of their state standards.

Still, one of the main takeaways from the report — that low-income students of color have less access to good teaching — is generally backed by past research.

High-poverty schools have higher rates of teacher turnover and more inexperienced teachers, on average. Other research in a number of states and cities, including Washington state, North Carolina, New York City, and Los Angeles, has shown that teachers of low-income students are less effective at raising test scores.

Are Children Learning

Chicago is sending more high schoolers to college — but how to get them to graduate?

PHOTO: Adeshina Emmanuel / Chalkbeat
Mayor Rahm Emanuel, CPS CEO Janice Jackson, and other city officials convened at Michele Clark Magnet High School in the Austin neighborhood to announce the latest college enrollment statistics.

Senior Tanariya Thompson, 17, said she and her friends at Michele Clark Magnet High School are constantly asking each other about where they want to go to college. But they’re not just talking, they’re doing their research, too.

“In a lot of our seminar classes I see more kids on the computers applying for colleges instead of just sitting there looking or saying, ‘I ain’t going to college,’” she said. “We’re serious: We want to go to a college so we can become somebody. Next week, I will have my top three.”

Chicago Public Schools released data today showing that more students than ever before are enrolling in college. The mayor and district officials announced the encouraging figures on the West Side, at Michele Clark High School, where students said they’ve seen more energy, excitement and urgency among their peers around the idea of enrolling in college.

The data shows that 1,000 more Chicago Public School graduates from the Class of 2017 enrolled in college compared with 2016, a 4.8 percent increase and the biggest one-year jump in nearly a decade.

Chicago still has a problem with public school graduates staying in and completing college. In 2016, just 18 percent of ninth graders were projected to attain a bachelor’s degree within six years of high school graduation, and four-year college graduation rates have remained pretty stagnant since 2009, according to a fall 2017 report by the UChicago Consortium on School Research. (The report didn’t calculate two-year degree attainment).

But Mayor Rahm Emanuel called the latest enrollment data “an incredible statement about where Chicago Public School students are,” adding that nearly 90 percent of high school freshmen were on track for graduation.

“Every time they walk around and say, ‘not those kids, not from that school, not that background, not that ZIP code, not that family’ — you come here to Michele Clark and you tell these kids that,” Emanuel said, knocking on the wooden podium before him for emphasis.  “You guys have proved them wrong every step of the way.”

From 2010 to 2017, the college enrollment rate increased from 53.7 percent to 64.6 percent, according to the school district. Officials credited everything from partnerships with OneGoal and other organizations focused on getting kids to and through college, to a summer text-messaging campaign that nudged graduates toward completing action items along the enrollment path, to scholarships to city colleges for students who attain a B average or higher.

They also noted a shift in perspective.

“I think it’s because people have become more serious,” said Michele Clark Principal Charles Anderson. “I’ve seen it in action with people doing more college trips, people getting out to scholarship fairs, students having a different mindset.”

From 2016 to 2017, college enrollment rates for African-American and Latino students improved by 2.3 percentage points and 7.2 percentage points, respectively, according to the school district. The African-American college enrollment rate increased from 55.4 percent in 2016 to 57.7 percent in 2017, and the Hispanic college enrollment rate leaped from 59 percent in 2016 to 66.2 percent in 2017, according to district data.

Flanked by Chicago schools chief Janice Jackson and City Colleges Chancellor Juan Salgado, Emanuel said, “it used to be as a system, we were done just getting you to high school graduation, and our responsibility was over,” but now it’s different. The mayor added, “the biggest transformation is the mindset not just of our kids, but of the system.”

“It’s why we’re also making sure we set a goal that by 2019, every child has a plan for what comes next,” Emanuel said, alluding to a new CPS graduation requirement that demands every student “has a meaningful planning conversation with an adult, and graduates with a plan to map out their future.”

The data indicate more students are enrolling at the City Colleges of Chicago.

The district said 5.8 percent more students enrolled at city colleges in 2017 compared with the previous year. Of district graduates who attended two-year colleges in 2017, 84.5 percent enrolled at city colleges, compared with 78.7 percent the previous year, according to the district. City Colleges Chancellor Juan Salgado praised the mayor and schools chief’s leadership, saying CPS’ gains were strong steps toward officials’ goals of “a more inclusive economy” in Chicago.

“We also want to make sure that each of you has a role in this economy, whether it’s downtown, or in our health-care centers, or at a logistics company, or engineering or manufacturing company or a tech company,” Salgado told the students. “This city will have a place for you.”

Officials said the climbing college enrollment rate mirrored the increasing number of district students earning high school diplomas, and also reflected district students’ overall strong academic progress. Yet the share of graduates who enrolled in college in 2015 and were still enrolled the following year, 72.3 percent, is actually down slightly from 2010, when it was 72.8 percent.

That — and the low rates of Chicago Public School students who eventually graduate with a two- or four-year degree — are worrisome figures.

Furthermore, African-American and Latino students and students with disabilities still graduate from high school, enroll in and graduate from college at lower rates than the general population. It’s a sobering reminder of inequities in the school system.

Officials acknowledged that work remains to get more students to and through college.

That point wasn’t lost on Michele Clark senior Naquanis Hughes, 17, who wants to study business in college but is still undecided on where. Hughes said staff, students, and even alumni offer this encouragement about getting through the hard knocks that some students encounter in higher education:

“If you come to a hard place, don’t just fall down, don’t just give up, keep pushing yourself.”