data dump

Colorado state test scores inch up, but wide socioeconomic gaps remain

PHOTO: Helen H. Richardson/The Denver Post

Three years after Colorado introduced new, more demanding standardized tests, student performance statewide is slowly ticking up, according to data released Thursday.

Most students still are falling well short of meeting the state’s expectations on the PARCC math and English tests, which are meant to measure whether students are on track to be prepared for life after high school.

But state officials applauded progress: 42 percent of students who took the tests last spring met the state’s learning goals in English, and 33 percent met them in math. That’s an increase of about 2 percentage points in both subjects since 2015, the first year the tests were given.

The state’s poorest students continue to lag academically behind their more affluent peers by wide margins. The gaps remain wide — some as large as 30 percentage points — and are generally not tightening, because all students are making progress at about the same rate.

Only 27 percent of Colorado fourth-graders who qualify for subsidized meals at school met grade-level expectations on the English test, while 58 percent of their more affluent peers made the grade.

“We are pleased to see performance improvements by so many students across Colorado, and we know this only comes after a lot of hard work and dedication from educators, parents and students,” Katy Anthes, the state’s education commissioner, said in a statement. “At the same time, our focus on our historically disadvantaged students must remain a top priority. In too many cases, those groups are not showing gains at a pace that will allow them to catch up, so CDE will increase our focus on providing support to our districts and schools to help them with this challenge in the next few years.”

Those results were part of a trove of student testing data released Thursday by the Colorado Department of Education.

Besides achievement data from the state’s English and math tests, the department also released results from its science and social studies tests, and the PSAT and SAT tests that high school sophomores and juniors take. Additionally, the state released student growth data, which measures how much students learn during an academic year compared to other students who scored similarly to them on tests the previous year.

Results for individual students are shared with families, and collectively the state uses them to rate school quality. Some districts use the results in evaluating teachers — one reason the tests are controversial.

About 555,000 students between the third and 11th grades took state tests last spring.

On PARCC, participation rates ticked up slightly, ranging statewide from 96.4 percent in third grade to 76 percent in ninth grade. Since Colorado began giving the exams in 2015, schools, especially those in affluent suburbs and rural areas, have struggled to meet a federal requirement of testing 95 percent of their students.

This year’s results were released earlier than in past years, and more data was released at one time. One criticism of PARCC has been how long it’s taken for results to be available.

Data transparency activists, however, are sure to cringe at the array of school-level results that won’t be made public due to ongoing concerns about student privacy. More than 20 percent of the results released from PARCC exams were redacted to ensure the public cannot identify an individual student’s results.

The state does this by following a complex set of rules triggered when fewer than 16 students at a school score in a particular range. Before adopting those rules, the state redacted results only if fewer than four students at a school had the same score.
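The threshold rule described above can be illustrated with a minimal sketch. This is not the state’s actual suppression algorithm, which involves a more complex set of interlocking rules; the function name and the score categories below are hypothetical, and only the basic idea — redact any result category with fewer than 16 students — is taken from the article.

```python
# Simplified illustration of a small-cell suppression rule: redact any
# score category at a school that contains fewer than 16 students.
# (Hypothetical names; the real CDE rules are considerably more complex.)

N_THRESHOLD = 16

def redact_small_groups(counts_by_category):
    """Replace counts below the threshold with a redaction marker."""
    return {
        category: (count if count >= N_THRESHOLD else "redacted")
        for category, count in counts_by_category.items()
    }

# Example: a school where two categories fall under the threshold.
school_results = {"exceeded": 42, "met": 30, "approached": 12, "did not meet": 8}
redacted = redact_small_groups(school_results)
```

Under a rule like this, small schools and small demographic subgroups are the most likely to have results withheld, which is why critics say the higher threshold hides so much data.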

Find your school’s PARCC scores
Search for your school’s PARCC scores in Chalkbeat’s database here.

“The new tests were supposed to provide better information about what is working and now we know far less,” said Van Schoales, CEO of A+ Colorado, an education watchdog group. “It’s outrageous that CDE has arbitrarily hidden so much of the achievement data making it difficult to know whether schools or districts are working. Only through knowing what works will Colorado educators be able to improve our schools.”

There are other limitations to what the state releases. Ninth-graders can take PARCC math tests of varying degrees of difficulty. That, along with lower student participation rates on ninth-grade tests, makes comparisons next to impossible. This will be the last year the issue arises: This spring, all ninth-graders will take a version of the PSAT.

In fact, Colorado is beginning a transition away from PARCC tests in all grades starting this year.

District achievement results

Officials in the state’s largest school district, Denver Public Schools, celebrated the district’s positive test results.

The 92,000-student district, where a majority of students are low-income, inched closer to state averages on the tests. The number of students who met the state’s proficiency bar on the English test climbed in every grade. Math results were more mixed: Scores went up on six of the state’s 11 tests.

Aurora Public Schools, the only school district at risk of facing state intervention next year if its quality rating doesn’t improve, showed increases in the number of students meeting or exceeding expectations on several tests across multiple grades, including big gains on eighth-grade English and fifth-grade math.

But among the state’s ten largest school districts, Aurora continued to post the lowest scores. For example, only 25 percent of fourth graders in the 41,000-student district met the state’s expectations on the English test.

Which kids took which test?
Third- through ninth-graders took the PARCC English and math tests; fifth-, eighth- and 11th-graders took the state’s science test; and fourth- and seventh-graders at sampled schools took the state’s social studies exam. Tenth-graders took the PSAT 10 for the second year, and 11th-graders took the SAT as the state’s college entrance exam for the first time.

Progress was also mixed at school districts that serve large at-risk student populations and have a history of chronic low performance on state exams.

More detailed district- and school-level data, including breakdowns of achievement gaps between different student groups, is expected within a month, state officials said.

Growth

A student’s growth percentile, which ranges from 1 to 99, indicates how that student’s performance changed over time, relative to students with similar performances on state assessments. Put another way, growth is calculated by measuring how students progressed compared to students who had similar scores to them on tests given a year earlier.

This data, which makes up the majority of a school’s or district’s state quality rating, helps provide a better understanding of how students are progressing, regardless of whether they are proficient.

The state average growth score is always the 50th percentile, so any growth score above that is considered positive. A score of 50 represents about a year’s worth of learning.
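The growth-percentile idea above can be sketched in a few lines. Colorado’s actual model is a far more sophisticated statistical method; this hypothetical function only conveys the basic mechanics: rank a student’s current score against peers who had similar scores on last year’s tests, expressed on a 1-to-99 scale.

```python
# Hedged illustration of a student growth percentile: where does this
# student's new score rank among peers who scored similarly last year?
# (Hypothetical simplification; the state's real model is more complex.)
from bisect import bisect_left

def growth_percentile(student_score, similar_peer_scores):
    """Return a 1-99 percentile rank of student_score among peers
    who posted similar scores on the previous year's test."""
    ranked = sorted(similar_peer_scores)
    num_below = bisect_left(ranked, student_score)
    pct = round(100 * num_below / len(ranked))
    return min(max(pct, 1), 99)  # clamp to the 1-99 range
```

For example, a student who outscores 60 of 100 similar-prior-score peers lands at roughly the 60th percentile — above the statewide average of 50, and therefore counted as positive growth.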

As with achievement scores, the state’s poor students trail their more affluent peers in academic growth. Students qualifying for free or reduced-price lunches hit the 48th percentile on English tests and the 46th percentile on math. Students who don’t qualify hit the 52nd percentile on English and the 53rd on math.

Students in Denver continued to post strong academic growth scores, leading the state’s five largest school districts in that measure.

Find your school’s growth scores
Search for your school’s growth scores in Chalkbeat’s database here.

“Every year for the past seven, in every subject, our kids have shown more growth than their peers across the state,” said Denver Superintendent Tom Boasberg. “This year was our best growth year ever.”

Meanwhile, students in the wealthier south suburban school district of Cherry Creek fell below the state average on growth on English tests, according to the state data. While other nearby school districts were closing growth gaps between their poor and more affluent students, the gap on English tests in Cherry Creek widened by a point.

Judy Skupa, Cherry Creek’s assistant superintendent, said the district will spend time analyzing its growth data but won’t rush to make sweeping changes based on one year of data.

“Like with anything else, it’s about the trend,” she said.

– Chalkbeat reporters Melanie Asmar and Yesenia Robles contributed

testing testing

McQueen declares online practice test of TNReady a success

PHOTO: Manuel Breva Colmeiro/Getty Images

Tennessee’s computer testing platform held steady Tuesday as thousands of students logged on to test the test that lumbered through fits and starts last spring.

Hours after completing the 40-minute simulation with the help of more than a third of the state’s school districts, Education Commissioner Candice McQueen declared the practice run a success.

“We saw what we expected to see: a high volume of students are able to be on the testing platform simultaneously, and they are able to log on and submit practice tests in an overlapping way across Tennessee’s two time zones,” McQueen wrote district superintendents in a celebratory email.

McQueen ordered the “verification test” as a precaution to ensure that Questar, the state’s testing company, had fixed the bugs that contributed to widespread technical snafus and disruptions in April.

The spot check also allowed students to gain experience with the online platform and TNReady content.

“Within the next week, the districts that participated will receive a score report for all students that took a practice test to provide some information about students’ performance that can help inform their teachers’ instruction,” McQueen wrote.

The mock test simulated real testing conditions that schools will face this school year, with students on Eastern Time submitting their exams while students on Central Time were logging on.

In all, about 50,000 students across 51 districts participated, far more than the 30,000 high schoolers who will take their exams online after Thanksgiving in this school year’s first round of TNReady testing. Another simulation is planned before April when the vast majority of testing begins both online and with paper materials.

McQueen said her department will gather feedback this week from districts that participated in the simulation.

testing 1-2-3

Tennessee students to test the test under reworked computer platform

PHOTO: Getty Images

About 45,000 students in a third of Tennessee districts will log on Tuesday for a 40-minute simulation to make sure the state’s testing company has worked the bugs out of its online platform.

That platform, called Nextera, was rife with glitches last spring, disrupting days of testing and mostly disqualifying the results from the state’s accountability systems for students, teachers, and schools.

This week’s simulation is designed to make sure those technical problems don’t happen again under Questar, which in June will finish out its contract to administer the state’s TNReady assessment.

Tuesday’s trial run will begin at 8:30 a.m. Central Time and 9 a.m. Eastern Time in participating schools statewide to simulate testing scheduled for Nov. 26-Dec. 14, when some high school students will take their TNReady exams. Another simulation is planned before spring testing begins in April on a much larger scale.

The simulation is expected to involve far more than the 30,000 students who will test in real life after Thanksgiving. It also will take into account that Tennessee is split into two time zones.

“We’re looking at a true simulation,” said Education Commissioner Candice McQueen, noting that students on Eastern Time will be submitting their trial test forms while students on Central Time are logging on to their computers and tablets.

The goal is to verify that Questar, which has struggled to deliver a clean TNReady administration the last two years, has fixed the online problems that caused headaches for students who tried unsuccessfully to log on or submit their end-of-course tests.


Here’s a list of everything that went wrong with TNReady testing in 2018


The two primary culprits were features that Questar added after a successful administration of TNReady last fall but before spring testing began in April: 1) a text-to-speech tool that enabled students with special needs to receive audible instructions; and 2) the coupling of the test’s login system with a new system for teachers to build practice tests.

Because Questar made the changes without conferring with the state, the company breached its contract and was docked $2.5 million out of its $30 million agreement.

“At the end of the day, this is about vendor execution,” McQueen told members of the State Board of Education last week. “We feel like there was a readiness on the part of the department and the districts … but our vendor execution was poor.”

PHOTO: TN.gov
Education Commissioner Candice McQueen

She added: “That’s why we’re taking extra precautions to verify in real time, before the testing window, that things have actually been accomplished.”

By the year’s end, Tennessee plans to request proposals from other companies to take over its testing program beginning in the fall of 2019, with a contract likely to be awarded in April.

The administration of outgoing Gov. Bill Haslam has kept both of Tennessee’s top gubernatorial candidates — Democrat Karl Dean and Republican Bill Lee — in the loop about the process. Officials say they want to avoid the pitfalls that happened as the state raced to find a new vendor in 2014 after the legislature pulled the plug on participating in a multi-state testing consortium known as PARCC.


Why state lawmakers share the blame, too, for TNReady testing headaches


“We feel like, during the first RFP process, there was lots of content expertise, meaning people who understood math and English language arts,” McQueen said. “But the need to have folks that understand assessment deeply as well as the technical side of assessment was potentially missing.”