To our readers

Hey, we heard you. You had a lot of questions about TNReady. We found answers.

The news that Tennessee’s testing company scored some high school tests incorrectly this year uncorked a flood of questions about the validity of the state’s new standardized assessment.


We wanted to know how the brouhaha was impacting classrooms, so we asked our readers on Facebook.

You responded in droves.

We took your top concerns directly to the state Department of Education and asked for answers. Here’s what you wanted to know — and what we have learned:

Several readers asked why they should trust TNReady results, given the series of setbacks in the test’s first two years.

  • “I do not trust the results. We have had so many problems in the last few years, that I am suspicious of any results we do get. It bothers me greatly that the state uses these numbers to hold students and teachers and districts accountable, but they seem to be unable to deliver scores they believe are accurate in a timely manner.” —Rebecca Dickenson
  • “I no longer trust the accountability of the state nor its methods. My concern is if there is a teacher who has only one year of test data, how is it the same teacher shown multi-year growth when he or she had only last year of testing? This poses a huge concern.” —Mildred Williams  

Tennessee Department of Education: “TNReady is fully aligned to Tennessee’s academic standards, and every question has been reviewed, edited, and approved by Tennessee teachers through a rigorous review process. We also have quantitative checks and processes after a test is over to ensure student responses are reliable. While more than 99.9% of TNReady tests were scored accurately this year, we want to improve on that next year, and our vendor (Questar) is taking new quality assurance steps to make sure their programming is error-free. Also, this year, as soon as the scoring error on some of the English I, II and Integrated Math II EOCs was identified, scores were updated and all TNReady tests were re-reviewed and verified for full accuracy.”

Some teachers told us that, given the delay in score deliveries this spring, many students don’t think the results will arrive in time to affect their final grades next spring. Those teachers are struggling to get their students to buy in.

  • “After two years of TNReady, it still hasn’t counted for my students. Going into year three, I will once again tell them with a hopeful, straight face that it will count as part of their report card grades and implore them to try their best. I quietly wonder what reason they have to believe me, given recent history.” —Mike Stein
  • “I struggle to get students to buy in to the importance of trying their best on state tests because the students are confident that the scores won’t come back in time to affect their grades (which has been the situation for several years now). The students see zero incentive for doing well.” —Nicole Mayfield

TDOE: “We believe that if districts and schools set the tone that performing your best on TNReady is important, then students will take the test seriously, regardless of whether TNReady factors into their grade. We should be able to expect our students to try and do their best at any academic exercise, whether or not it is graded. This is a value that is established through local communication from educators and leaders, and it will always be key to our test administration. We believe that when we share these messages and values (celebrating the variety of accomplishments our students have made, taking advantage of TNReady’s scheduling flexibility to minimize disruption, focusing on strong standards-based instruction every day, sending positive messages around the importance of the variety of tests that students take, and sharing that students should always do their best), students will buy in and TNReady will be successful.”

Other teachers asked what happens to writing scores for tests in English language arts.

  • “I can tell you that two years ago — when we first piloted the new writing test online — districts received not only every student’s scores (broken down by each of the four indicators) but also the actual student responses to each prompt. In my former district our supervisor shared them, and we analyzed them as a department. If you check with your principal, VP, or supervisors, there are some published “anchor papers” with scores available on edtools from this past year. It’s not a lot, but it’s more than we’ve had in the past. My hope is that if online continues, we’ll keep seeing the student responses in the future.” —Wj Gillespie II

TDOE: “The question appears to be referencing the process we had through the 2014-15 school year, when our writing assessment was separate. Since 2015-16, students’ writing responses on TNReady have been incorporated as part of their overall ELA score. Responses are scored based on our writing rubrics, and for educators, we have provided access to the “anchor papers” from the 2016-17 year, so they can see how students’ responses were scored based on the writing rubric, which can help them inform the feedback they give their students.”

On that same issue of writing scores, one teacher referenced the hiring of scorers off of Craigslist. We asked the state if that’s true.

  • “I continue to be curious about our ELA writing scores. Each year we are required to use state writing rubrics, attend PD related to the state’s four types of writing, etc etc…and yet our scores never come back. Students spend hours taking the writing portion of the test, scorers are hired off Craig’s list…, and yet we never actually get the scores back. It seems like every year this is swept under the rug. Where do these writing tests go?” —Elizabeth Faison Clifton

TDOE: “Questar does not use Craigslist. Several years ago, another assessment company supposedly posted advertisements on Craigslist, but Questar does not. We provide opportunities for our educators to be involved in developing our test, and we also encourage Tennessee teachers to apply to hand-score TNReady. To be eligible, each applicant must provide proof of a four-year college degree, and preference is given to classroom teachers. As part of the interview process, an applicant would have to hand-score several items for review and evaluation. Once hired, each scorer is trained based on materials that Tennessee teachers and the department approve — and which are assembled from responses given by Tennessee students on the exam — and scorers are regularly refreshed and “recalibrated” on scoring guidelines. Each writing response is scored at least twice; if those two scores differ significantly, the response is sent to a third scorer. Each day, the department reads behind a sample of essays to ensure hand-scorers are adhering to the criteria set by our teachers. Any scores that do not align are thrown out, and those scorers are retrained. Any scorer who does not meet our and Questar’s standards is released from scoring TNReady.”

Finally, readers expressed a lot of concern about the complexity behind growth scores known as TVAAS, which are based on TNReady results and which go into teachers’ evaluations. We asked the state for a simple explanation.

  • “What formula is used in calculating the overall score for TVAAS when fallacies were determined as a result? My performance is weighed heavily on the state TVAAS score which is why this type of error has occurred before. This is quite disturbing. Teachers work tirelessly to ensure student achievement is a success; however, testing to measure performance seems to not be working.” —Mildred Williams  
  • “No one can give me the formula for how my students’ scores are calculated to create my score in TVAAS. How is (t)hat transparency? Yet, I’m required, constantly, to “prove” myself with documentation of education, observations, professional development and the like; all in originals, of course, to numerous overseeing bodies.” —Rachel Bernstein Kannady
  • “I find it ludicrous that data from these tests are used to evaluate MY performance when I get little to no control over most of the variables regarding the test. How could a miscalculated, misinformed, and (for all I know) incomprehensible test demonstrate what my students have learned!? And don’t even get me started on the fact that the rigor of the tests was increased ten-fold, yet not scaffolded in.” —Nicole Mayfield

TDOE: “TVAAS is statistically valid and reliable, and we follow the recommendations outlined by the American Educational Research Association (AERA) on value-added measures. Conceptually, TVAAS looks at how students have performed historically on TCAP and TNReady and compares their performance to their peers who have had similar past performance. If students tended to grow at about the same rate as their peers across the state — the expected amount of growth — they would earn a 3. If students tended to grow faster than their peers, they would earn a 4 or a 5, depending on the amount of progress they showed. If they tended to not show as much growth as their peers, they would earn a 1 or a 2. The model itself is sophisticated and complex to be as fair and nuanced as possible for each teacher’s situation, and we are working with our educator preparation providers as well as district leaders to provide more training on specifically how the model calculates scores. Tennessee educators also have access to a TVAAS user support team that can answer any specific questions about their TVAAS data, including how the data was analyzed.

“Because TVAAS always looks at relative growth from year to year, not absolute test scores, it can be stable through transitions — and that is what we saw this year. Students can still grow, even if their overall proficiency level is now different. You can think about it like a running race. If you used to finish a 5K at about the same time as 10 other students, and all 10 students made the same shift to a new race at the same time with the same amount of time to prepare, you should finish the new race at about the same time. If you finished ahead of the group’s average time, you grew faster than your peers. If you lagged behind everyone, that would indicate you did not grow as much as was expected. Because students’ performance will be compared to the performance of their peers and because their peers are making the transition at the same time, drops in statewide proficiency rates resulting from increased rigor of the new assessments had no impact on the ability of teachers, schools, and districts to earn strong TVAAS scores. Transitions to higher standards and expectations do not change the fact that we still want all students in a district to make a full year’s worth of growth, relative to their peers who are all experiencing the same transition.”
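For readers who want a more concrete feel for the idea the department describes, here is a deliberately simplified sketch in Python. It is not the TVAAS formula (the actual model is far more statistically sophisticated), but it illustrates the underlying concept: compare each student’s growth to the typical growth of peers who started at a similar level, then translate a classroom’s average relative growth into a 1-to-5 level. All of the data, peer-matching rules, function names, and cut points below are hypothetical and chosen only to show the concept.

```python
# Toy illustration of the relative-growth idea behind TVAAS-style scores.
# This is NOT the actual TVAAS model; the data, peer grouping, and
# thresholds below are hypothetical and greatly simplified.

from statistics import mean

# Hypothetical (prior_score, current_score) pairs for students statewide,
# used to estimate the typical growth of peers with similar starting points.
statewide = [
    (300, 320), (305, 318), (310, 331), (320, 335), (330, 342),
    (340, 355), (350, 361), (360, 377), (370, 380), (380, 396),
]

def expected_growth(prior, peers, band=15):
    """Average growth among peers whose prior score is within `band` points."""
    similar = [cur - pri for pri, cur in peers if abs(pri - prior) <= band]
    return mean(similar) if similar else 0.0

def classroom_relative_growth(classroom, peers):
    """Average of (actual growth minus expected growth) across a classroom."""
    diffs = [(cur - pri) - expected_growth(pri, peers) for pri, cur in classroom]
    return mean(diffs)

def to_level(relative_growth, cut=3.0):
    """Map relative growth onto a 1-5 scale using hypothetical cut points."""
    if relative_growth > 2 * cut:
        return 5
    if relative_growth > cut:
        return 4
    if relative_growth >= -cut:
        return 3   # grew at about the same rate as similar peers
    if relative_growth >= -2 * cut:
        return 2
    return 1

# A hypothetical classroom whose students grew a bit faster than similar peers.
classroom = [(305, 328), (322, 340), (355, 372)]
print(to_level(classroom_relative_growth(classroom, statewide)))  # prints 4
```

In this toy version, a classroom whose students outpaced similar peers lands at a 4. The real model draws on multiple years of data and handles the testing transition described above, among other adjustments, which is why the department points educators to its TVAAS user support team for questions about specific scores.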

Reporter Laura Faith Kebede contributed to this report.

more digging

Kingsbury High added to list of Memphis schools under investigation for grade changing

PHOTO: Shelby County Schools
Kingsbury High School was added to a list of schools being investigated by an outside firm for improper grade changes. Here, Principal Terry Ross was featured in a Shelby County Schools video about a new school budget tool.

Another Memphis high school has been added to the list of schools being investigated to determine if they made improper changes to student grades.

The addition of Kingsbury High School to the seven Shelby County Schools campuses already under review will further delay the report, which was initially expected in mid-June.

But from what school board Chairwoman Shante Avant has heard so far, “there haven’t been any huge irregularities.”

“Nothing has surfaced that gives me pause at this point,” Avant told Chalkbeat on Thursday.

The accounting firm Dixon Hughes Goodman is conducting the investigation.

This comes about three weeks after a former Kingsbury teacher, Alesia Harris, told school board members that Principal Terry Ross instructed someone to change 17 student exam grades to 100 percent — against her wishes.

Shelby County Schools said the allegations were “inaccurate” and that the grade changes were a mistake that was self-reported by an employee.

“The school administration immediately reported, and the central office team took the necessary actions and promptly corrected the errors,” the district said in a statement.

Chalkbeat requested a copy of the district’s own initial investigation the day after Harris spoke at the board’s June meeting, but district officials said they likely would not have a response for Chalkbeat until July 27.

Harris said that, as of Thursday, no one from Dixon Hughes Goodman had contacted her regarding the investigation.

The firm’s investigation initially included seven schools. Kingsbury was not among them. Those seven schools are:

  • Kirby High
  • Raleigh-Egypt High
  • Bolton High
  • Westwood High
  • White Station High
  • Trezevant High
  • Memphis Virtual School

The firm’s first report found as many as 2,900 failing grades changed during four years at nine Memphis-area schools. At the board’s request, two schools were removed from the review: one a charter managed by a nonprofit, the other a school outside the district. The firm said at the time that further investigation was warranted to determine if the grade changes were legitimate.

The $145,000 investigation includes interviewing teachers and administrators, comparing teachers’ paper grade books to electronic versions and the accompanying grade-change forms, and inspecting policies and procedures for how school employees track and submit grades.

Since the controversy started last year, the district has restricted the number of employees authorized to make changes to a student’s report card or transcript, and also requires a monthly report from principals detailing any grade changes.

Silver Lining Playbook

Memphis’ youngest students show reading gains on 2018 state tests — and that’s a big deal

PHOTO: Caroline Bauman
A student works on reading comprehension skills at Lucie E. Campbell Elementary School, part of Shelby County Schools in Memphis.

Those working to improve early literacy rates in Shelby County Schools got a small morale boost Thursday as newly released scores show the district’s elementary school students improved their reading on 2018 state tests.

The percentage of Memphis elementary-age students considered proficient in reading rose by 3 percentage points to almost one-fourth of the district’s children in grades 3 through 5. That’s still well below the state average, and Superintendent Dorsey Hopson said “we obviously have a long way to go.”

PHOTO: Caroline Bauman
Superintendent Dorsey Hopson has overseen Tennessee’s largest public school district since 2013.

Strengthening early literacy has been a priority for the Memphis district, which views better reading skills as crucial to predicting high school graduation and career success. To that end, Shelby County Schools has expanded access to pre-K programs, adjusted reading curriculum, and made investments in literacy training for teachers.

Hopson said the payoff on this year’s TNReady scores was a jump of almost 5 percentage points in third-grade reading proficiency.

“It was about five years ago when we really, really, really started pushing pre-K, and those pre-K kids are now in the third grade. I think that’s something that’s really positive,” Hopson said of the gains, adding that third-grade reading levels are an important indicator of future school performance.

TNReady scores for Shelby County Schools, which has a high concentration of low-performing schools and students living in poverty, were a mixed bag, as they were statewide.

Math scores went up in elementary, middle, and high schools in Tennessee’s largest district. But science scores went down across the board, and the percentage of high school students who scored proficient in reading dropped by 4 percentage points.

The three charts below illustrate, by subject, the percentages of students who performed on track or better in elementary, middle, and high schools within Shelby County Schools. The blue bars reflect the district’s most recent scores, the black bars show last year’s scores, and the yellow bars depict this year’s statewide averages.

Hopson said he was unsure how much the scores of older students — all of whom tested online — were affected by technical problems that hampered Tennessee’s return this year to computerized testing.

“From what people tell me, kids either didn’t try as hard in some instances or didn’t take it seriously,” Hopson told reporters. “We’ll never know what the real impact is, but we have to accept the data that came from these tests.”

But students in two of the district’s school improvement initiatives — the Innovation Zone and the Empowerment Zone — showed progress. “We’re going to double down on these strategies,” Hopson said of the extra investments and classroom supports.

In the state-run Achievement School District, or ASD, which oversees 30 low-performing schools in Memphis, grades 3 through 8 saw an uptick in scores in both reading and math. But high schoolers scored more than 3 percentage points lower in reading and also took a step back in science.

The ASD takes over schools in the state’s bottom 5 percent and assigns them to charter operators to improve. But in the five years that the ASD has been in Memphis, its scores have been mostly stagnant.

Tennessee Education Commissioner Candice McQueen said she and new ASD Superintendent Sharon Griffin are reviewing the new data to determine next steps.

“We are seeing some encouraging momentum shifts,” McQueen said.

Chalkbeat illustrator Sam Park contributed to this story.