The news that Tennessee’s testing company scored some high school tests incorrectly this year uncorked a flood of questions about the validity of the state’s new standardized assessment.
We wanted to know how the brouhaha was impacting classrooms, so we asked our readers on Facebook.
You responded in droves.
Chalkbeat is an independent nonprofit news organization telling the story of education in America.
We took your top concerns directly to the state Department of Education and asked for answers. Here’s what you wanted to know — and what we have learned:
Several readers asked why they should trust TNReady results, given the series of setbacks in the test’s first two years.
- “I do not trust the results. We have had so many problems in the last few years, that I am suspicious of any results we do get. It bothers me greatly that the state uses these numbers to hold students and teachers and districts accountable, but they seem to be unable to deliver scores they believe are accurate in a timely manner.” —Rebecca Dickenson
- “I no longer trust the accountability of the state nor its methods. My concern is if there is a teacher who has only one year of test data, how is it the same teacher shown multi-year growth when he or she had only last year of testing? This poses a huge concern.” —Mildred Williams
Tennessee Department of Education: “TNReady is fully aligned to Tennessee’s academic standards, and every question has been reviewed, edited, and approved by Tennessee teachers through a rigorous review process. We also have quantitative checks and processes after a test is over to ensure student responses are reliable. While more than 99.9% of TNReady tests were scored accurately this year, we want to improve on that next year, and our vendor (Questar) is taking new quality assurance steps to make sure their programming is error-free. Also, this year, as soon as the scoring error on some of the English I, II and Integrated Math II EOCs was identified, scores were updated and all TNReady tests were re-reviewed and verified for full accuracy.”
Some teachers told us that, after this spring’s delays in score delivery, many students doubt the results will arrive in time to count toward their final grades next spring. Those teachers are struggling to get their students to buy in.
- “After two years of TNReady, it still hasn’t counted for my students. Going into year three, I will once again tell them with a hopeful, straight face that it will count as part of their report card grades and implore them to try their best. I quietly wonder what reason they have to believe me, given recent history.” —Mike Stein
- “I struggle to get students to buy in to the importance of trying their best on state tests because the students are confident that the scores won’t come back in time to affect their grades (which has been the situation for several years now). The students see zero incentive for doing well.” —Nicole Mayfield
TDOE: “We believe that if districts and schools set the tone that performing your best on TNReady is important, then students will take the test seriously, regardless of whether TNReady factors into their grade. We should be able to expect our students will try and do their best at any academic exercise, whether or not it is graded. This is a value that is established through local communication from educators and leaders, and it will always be key to our test administration. We believe that when we share these messages and values — celebrating the variety of accomplishments our students have made, taking advantage of TNReady’s scheduling flexibility to minimize disruption, focusing on strong standards-based instruction every day, sending positive messages around the importance of the variety of tests that students take, and sharing that students should always do their best — then students will buy-in and TNReady will be successful.”
Other teachers asked what happens to writing scores for tests in English language arts.
- “I can tell you that two years ago — when we first piloted the new writing test online — districts received not only every student’s scores (broken down by each of the four indicators) but also the actual student responses to each prompt. In my former district our supervisor shared them, and we analyzed them as a department. If you check with your principal, VP, or supervisors, there are some published “anchor papers” with scores available on edtools from this past year. It’s not a lot, but it’s more than we’ve had in the past. My hope is that if online continues, we’ll keep seeing the student responses in the future.” —Wj Gillespie II
TDOE: “The question appears to be referencing the process we had through the 2014-15 school year, when our writing assessment was separate. Since 2015-16, students’ writing responses on TNReady have been incorporated as part of their overall ELA score. Responses are scored based on our writing rubrics, and for educators, we have provided access to the “anchor papers” from the 2016-17 year, so they can see how students’ responses were scored based on the writing rubric, which can help them inform the feedback they give their students.”
On that same issue of writing scores, one teacher referenced the hiring of scorers off Craigslist. We asked the state if that’s true.
- “I continue to be curious about our ELA writing scores. Each year we are required to use state writing rubrics, attend PD related to the state’s four types of writing, etc etc…and yet our scores never come back. Students spend hours taking the writing portion of the test, scorers are hired off Craig’s list…, and yet we never actually get the scores back. It seems like every year this is swept under the rug. Where do these writing tests go?” —Elizabeth Faison Clifton
TDOE: “Questar does not use Craigslist. Several years ago, another assessment company supposedly posted advertisements on Craigslist, but Questar does not. We provide opportunities for our educators to be involved in developing our test, and we also encourage Tennessee teachers to apply to hand-score TNReady. To be eligible, each applicant must provide proof of a four-year college degree, and preference is given to classroom teachers. As part of the interview process, an applicant would have to hand-score several items for review and evaluation. Once hired, each scorer is trained based on materials that Tennessee teachers and the department approve — and which are assembled from responses given by Tennessee students on the exam — and scorers are regularly refreshed and “recalibrated” on scoring guidelines. Each writing response is scored at least twice; if those two responses differ significantly, they are sent to a third scorer. Each day, the department reads behind a sample of essays to ensure hand-scorers are adhering to the criteria set by our teachers. Any scores that do not align are thrown out, and those scorers are retrained. Any scorer who does not meet our and Questar’s standards is released from scoring TNReady.”
Finally, readers expressed a lot of concern about the complexity behind growth scores known as TVAAS, which are based on TNReady results and which go into teachers’ evaluations. We asked the state for a simple explanation.
- “What formula is used in calculating the overall score for TVAAS when fallacies were determined as a result? My performance is weighed heavily on the state TVAAS score which is why this type of error has occurred before. This is quite disturbing. Teachers work tirelessly to ensure student achievement is a success; however, testing to measure performance seems to not be working.” —Mildred Williams
- “No one can give me the formula for how my students’ scores are calculated to create my score in TVAAS. How is (t)hat transparency? Yet, I’m required, constantly, to “prove” myself with documentation of education, observations, professional development and the like; all in originals, of course, to numerous overseeing bodies.” —Rachel Bernstein Kannady
- “I find it ludicrous that data from these tests are used to evaluate MY performance when I get little to no control over most of the variables regarding the test. How could a miscalculated, misinformed, and (for all I know) incomprehensible test demonstrate what my students have learned!? And don’t even get me started on that fact that the rigor of the tests was increased ten-fold, yet not scaffolded in.” —Nicole Mayfield
TDOE: “TVAAS is statistically valid and reliable, and we follow the recommendations outlined by the American Educational Research Association (AERA) on value-added measures. Conceptually, TVAAS looks at how students have performed historically on TCAP and TNReady and compares their performance to their peers who have had similar past performance. If students tended to grow at about the same rate as their peers across the state — the expected amount of growth — they would earn a 3. If students tended to grow faster than their peers, they would earn a 4 or a 5, depending on the amount of progress they showed. If they tended to not show as much growth as their peers, they would earn a 1 or a 2. The model itself is sophisticated and complex to be as fair and nuanced as possible for each teacher’s situation, and we are working with our educator preparation providers as well as district leaders to provide more training on specifically how the model calculates scores. Tennessee educators also have access to a TVAAS user support team that can answer any specific questions about their TVAAS data, including how the data was analyzed.
“Because TVAAS always looks at relative growth from year to year, not absolute test scores, it can be stable through transitions — and that is what we saw this year. Students can still grow, even if their overall proficiency level is now different. You can think about it like a running race. If you used to finish a 5K at about the same time as 10 other students, and all 10 students made the same shift to a new race at the same time with the same amount of time to prepare, you should finish the new race at about the same time. If you finished ahead of the group’s average time, you grew faster than your peers. If you lagged behind everyone, that would indicate you did not grow as much as was expected. Because students’ performance will be compared to the performance of their peers and because their peers are making the transition at the same time, drops in statewide proficiency rates resulting from increased rigor of the new assessments had no impact on the ability of teachers, schools, and districts to earn strong TVAAS scores. Transitions to higher standards and expectations do not change the fact that we still want all students in a district to make a full year’s worth of growth, relative to their peers who are all experiencing the same transition.”
Reporter Laura Faith Kebede contributed to this report.