Growing pains

Three possible explanations for why students with special needs didn’t fare as well on PARCC

When state officials this week released new data showing how much students had grown academically year-to-year on state tests, one statistic jumped out.

The gap separating students with special needs from other students had grown dramatically, leaving educators and advocates searching for answers.

Colorado’s student growth report calculates how much students learn year-to-year compared to students who start in a similar place academically. Students with special needs not only lag behind other students, but this year’s data showed they are learning at a slower rate than two years ago.
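To make the peer-comparison idea concrete, here is a minimal illustrative sketch, not the state's actual formula: a growth percentile compares a student's current score only against "academic peers" who started from a similar place. The function name and the simple one-prior-score grouping are assumptions for illustration; Colorado's real model conditions on several years of score history.

```python
from collections import defaultdict

def growth_percentiles(records):
    """records: list of (student_id, prior_score, current_score).
    Returns an illustrative growth percentile per student: the percent
    of academic peers (same prior score) the student outscored this year."""
    peers = defaultdict(list)
    for _, prior, current in records:
        peers[prior].append(current)
    result = {}
    for student, prior, current in records:
        group = peers[prior]
        rank = sum(1 for score in group if score < current)
        result[student] = round(100 * rank / len(group))
    return result

data = [("A", 650, 700), ("B", 650, 720), ("C", 650, 680), ("D", 700, 710)]
print(growth_percentiles(data))
```

The key point the sketch captures is that students are never compared to the whole state, only to peers with similar starting scores, which is why a group can "lag behind" in absolute scores yet still post strong growth.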

Before we get to possible explanations, an important note about these students, who have individualized education plans that define goals and services each student should get.

Students have these plans for a variety of reasons. They include speech impediments, attention-deficit/hyperactivity disorder and emotional disabilities. A very small number have cognitive disabilities, experts say.

That means the overwhelming majority of students with such plans should be just as likely to score well on standardized tests as their peers without special needs if they have the right help, experts say.

While state officials and experts we spoke with are concerned about this year’s results, most cautioned that it was too early to reach firm conclusions or know whether this is the start of a trend or an outlier.

Here are some possible explanations for the widening growth gap for students with special needs:

Students with individualized education plans may not have enough access to inclusive classrooms with the critical thinking they need to do well on tests (and in life).

With the adoption of the Common Core State Standards in English and math, teachers and students have been asked to make fundamental changes in the classroom.

A greater emphasis has been placed on critical thinking over rote memorization. Students are being asked to read longer and denser passages and cite evidence in written responses.

Students with disabilities aren’t getting that opportunity, experts say.

“Too often kids with disabilities just don’t have the opportunity to learn,” said Sheryl Lazarus, a senior research associate with the National Center for Education Outcomes, which focuses on underserved students. “The reading and writing (on the assessments) were real challenges. Students need the opportunity to learn the grade-level content. Once they do that, they’ll do much, much better on these assessments.”

A report Lazarus co-authored surveyed teachers in states that used the PARCC exams or another multi-state test, Smarter Balanced. It found:

  • Students with special needs were not used to reading long passages like those found on the tests.
  • Those same students were not used to writing extended responses and lacked basic computer skills.
  • They also had difficulty using evidence to justify their answers and lacked basic research skills.

Angela Denning, the state education department’s special education chief, said state monitoring found only about 60 percent of students with special needs spent 80 percent or more of their time in classrooms with the general student population.

That’s not enough, she said.

“My bet is that schools with small or no growth gaps have students with disabilities receiving good instruction in those core areas in the regular education classrooms” with help and instructional strategies tailored for them, she said.

Denver Public Schools, which has one of the largest growth gaps between students with disabilities and other students, is focusing more on including all students in regular classroom work. Ten schools are part of a new pilot program seeking to better incorporate students with disabilities in general classrooms.

The district is also training teachers to write better learning plans for students with special needs, plans that include more data as well as goals for improvement and for meeting academic standards.

“We need to have our results translate to all kids,” said Josh Drake, DPS’s executive director for exceptional students.

Pam Bisceglia, a coordinator for AdvocacyDenver, which champions the rights of students with special needs, said both special education and general education teachers need more cross-training on how to better meet the needs of students regardless of what classroom they are in.

“There always has to be a shared responsibility to meeting kids’ needs,” she said.

While the new computer-based state tests have features meant to put students with special needs on a level playing field, that doesn’t mean students and teachers know how to use them.

The PARCC exams, which are mostly taken on computers, come with plenty of bells and whistles. A 200-plus page manual describes in detail what can be done to help students, including larger fonts, having passages read aloud and more.

“PARCC has tons of stuff built in for accommodations, but that doesn’t mean that’s better,” said Ann Morrison, an associate professor at the School of Education at the Metropolitan State University of Denver. “What we should look for is high degrees of ease of use … My sense about PARCC is that there is not an ease of use.”

Teacher and student frustrations with technology could put the results in question, Morrison said.

“Anxiety gets in the way of learning and demonstrating learning,” she said.

A PARCC spokeswoman said the group tests the tools used by students and is adding new ones. In 2016, PARCC included a function that allowed math problems to be read aloud in both English and Spanish, and in 2017 PARCC will offer a Braille version of the test.

Bisceglia said that while there was some confusion about how schools provided accommodations to students during PARCC’s first year, she heard of no complaints this year.

Kids with special needs opted out of the tests at a higher rate than their peers.

Colorado’s PARCC scores have been called into question because of the large number of students choosing not to take the tests in higher grades — mostly in high-performing schools.

Opt-out rates also are slightly higher for students with individualized education plans, state data show.

Derek Briggs, director of the University of Colorado Boulder’s Center for Assessment, Design, Research and Evaluation, suggested that within that group, students more likely to score well were the ones who skipped out.

“It’s a relatively small group,” Briggs said about the number of students with education plans. “It doesn’t take that much [to skew results].”

Denning, the state’s special education chief, said she’s asking an advisory council of parents and educators to examine why opt-out numbers are higher in the special education community.

More tweaks

For third straight year, TNReady prompts Tennessee to adjust teacher evaluation formula

PHOTO: Grace Tatter
Education Commissioner Candice McQueen announced last April that she was suspending TNReady testing for grades 3-8 for the 2015-16 school year. Now, her department is asking lawmakers to make more adjustments to the weight of student test scores in Tennessee's teacher evaluation formula.

First, Tennessee asked lawmakers to make temporary changes to its teacher evaluations in anticipation of switching to a new test, called TNReady.

Then, TNReady’s online platform failed, and the state asked lawmakers to tweak the formula once more.

Now, the State Department of Education is asking for another change in response to last year’s test cancellation, which occurred shortly after the legislative session concluded.

Under a proposal scheduled for consideration next Monday by the full House, student growth from TNReady would count for only 10 percent of teachers’ evaluation scores this year and 20 percent next school year. That’s compared to the 35 to 50 percent, depending on the subject, that test scores counted for in 2014-15, before the state switched to its more rigorous test.

The bill, carried by Rep. Eddie Smith of Knoxville, is meant to address teachers’ concerns about being evaluated by a brand new test.

Because testing was cancelled for grades 3-8 last spring, many students are taking the new test this year for the first time.

“If we didn’t have this phase-in … there wouldn’t be a relief period for teachers,” said Elizabeth Fiveash, assistant commissioner of policy. “We are trying to acknowledge that we’re moving to a new assessment and a new type of assessment.”

The proposal also mandates that TNReady scores count for only 10 percent of student grades this year, and for 15 to 25 percent by 2018-19.

The Tennessee Education Association has advocated scrapping student test scores from teacher evaluations altogether, but its lobbyist, Jim Wrye, told lawmakers on Tuesday that the organization appreciates slowing the process yet again.

“We think that limiting it to 10 percent this year is a wise policy,” he said.

To incorporate test scores into teacher evaluations, Tennessee uses TVAAS, a formula that’s supposed to show how much teachers contributed to individual student growth. TVAAS, which is short for the Tennessee Value-Added Assessment System, was designed to be based on three years of testing. Last year’s testing cancellation, though, means many teachers will be scored on only two years of data, a sore point for the TEA.

“Now we have a missing link in that data,” Wrye said. “We are very keenly interested in seeing what kind of TVAAS scores that are generated from this remarkable experience.”

Although TVAAS, in theory, measures a student’s growth, it really measures how a student does relative to his or her peers. The state examines how students who have scored at the same levels on prior assessments perform on the latest test. Students are expected to perform about as well on TNReady as their peers with comparable prior achievement in previous years. If they perform better, they will positively impact their teacher’s score.
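The expected-versus-actual logic described above can be sketched in a few lines. This is a rough illustration, not the real TVAAS model, which is built on several years of data and a far more sophisticated statistical method; the lookup table and function names here are invented for the example.

```python
from statistics import mean

# Hypothetical history: prior-score level -> what comparable students
# typically scored on the current year's test.
expected_by_prior = {650: 690, 700: 735}

def value_added(students):
    """students: list of (prior_score, actual_score) for one teacher.
    A positive result means the teacher's students, on average, beat
    the scores of peers with comparable prior achievement."""
    gains = [actual - expected_by_prior[prior] for prior, actual in students]
    return mean(gains)

roster = [(650, 700), (650, 685), (700, 745)]
print(value_added(roster))
```

The sketch also shows why a missing year of data matters: with one fewer test in each student's history, the "expected" baseline is estimated from less information, which is the TEA's concern about the cancelled 2015-16 tests.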

Using student test scores to measure teacher effectiveness has been the source of other debates around evaluations.

Historically, teachers of non-tested subjects such as physical education or art have been graded in part by schoolwide test scores. The House recently passed a bill that would require the state to develop other ways to measure growth for those teachers, and it is now awaiting passage by the Senate.

 

Deja vu

Last year, Ritz’s computer-based testing plan was largely dismissed. Today, McCormick adopted part of it as her own.

PHOTO: Shaina Cavazos
Glenda Ritz and Jennifer McCormick debated in Fort Wayne this past fall during the 2016 campaign.

Although she wasn’t on board with former state Superintendent Glenda Ritz’s entire testing plan during last year’s campaign, current Indiana schools chief Jennifer McCormick today expressed support for a computer-based test format Ritz lobbied hard for during her last year in office.

These “computer-adaptive” exams adjust the difficulty level of questions as kids get right or wrong answers. McCormick explained the format to lawmakers today when she testified on the “ILEARN” proposal that could replace the state’s unpopular ISTEP exam if it becomes law.

Computer-adaptive technology, she said, allows tests to be more closely tailored to the student. Test experts who spoke to Indiana policymakers this past summer have said the tests also generally take less time than “fixed-form” tests like the current ISTEP and could result in quicker turnaround of results.
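A toy sketch makes the adaptive mechanism clear: each right answer bumps the next question's difficulty up, each wrong answer bumps it down, so the test homes in on a student's level with fewer questions. Real adaptive exams use item-response-theory models rather than the fixed one-step adjustment assumed here.

```python
def adaptive_difficulties(answers, start=5, low=1, high=10):
    """answers: sequence of booleans (True = correct answer).
    Returns the difficulty level chosen for each successive question,
    starting from a middle-of-the-road first item."""
    level, path = start, [start]
    for correct in answers:
        # Step up after a correct answer, down after a miss,
        # staying within the item bank's difficulty range.
        level = min(high, level + 1) if correct else max(low, level - 1)
        path.append(level)
    return path

print(adaptive_difficulties([True, True, False, True]))
```

Because the test converges on each student quickly, fewer items are wasted on questions far above or below a student's level, which is the source of the shorter testing time the experts described.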

During the summer, members of a state commission charged with figuring out what Indiana’s new testing system could look like largely argued against this testing format, including the bill’s author, Rep. Bob Behning, R-Indianapolis. At the time, he said he was concerned about investing in a technology-heavy plan when much of the state struggles to get reliable internet and computer access. Today, Behning didn’t speak against the concept.

Overall, McCormick was supportive of House Bill 1003, but she pointed out a few areas that she’d like to see altered. More than anything, she seemed adamant that Indiana get out of the test-writing business, which has caused Hoosiers years of ISTEP-related headaches.

Read: Getting rid of Indiana’s ISTEP test: What might come next and at what cost

“Indiana has had many years to prove we are not good test-builders,” McCormick told the Senate Education Committee today. “To continue down that path, I feel, is not very responsible.”

The proposed testing system comes primarily from the recommendations of the state commission. The biggest changes would be structural: The bill would have the test given in one block of time at the end of the year rather than in the winter and spring. The state would go back to requiring end-of-course assessments in high school English, Algebra I and science.

The bill doesn’t spell out whether the test must be Indiana-specific or off-the-shelf, and McCormick suggested the state buy questions from existing vendors for the computer-adaptive test for grades 3-8, which would have to be aligned with state standards.

For high school, McCormick reiterated her support for using the SAT and suggested making the proposal’s end-of-course assessments optional.

The ILEARN plan, if passed into law, would be given for the first time in 2019.

“Spring of 2019 is a more realistic timeline no matter how painful it is for all of us,” McCormick said. “We could do it for (2018), but it might not be pretty. We tried that before as a state, and we couldn’t get it right.”

You can find all of Chalkbeat’s testing coverage here.