Digital drop

The national test of students’ progress has gone digital. A state leader is raising questions about what that means.

PHOTO: Marc Piscotty

The release of “the nation’s report card” on April 10 will be a big deal. The scores put a spotlight on the academic performance of all 50 states and many big school districts, and inevitably lead to jockeying about what the numbers mean for education policy.

That’s why it’s also a big deal when a state leader raises questions about the National Assessment of Educational Progress — which Louisiana Superintendent John White is doing now in the lead-up to the scores’ release.

In a March 23 letter to the National Center for Education Statistics, which administers the NAEP tests, White said he was concerned that this year’s switch from paper-and-pencil exams to computer-based exams might unfairly penalize some states. He called on NCES, a branch of the U.S. Department of Education, to release more information about the issue.

“I would like to be assured,” White wrote, that “the results and trends reported at the state level reflect an evaluation of reading and math skill rather than an evaluation of technology skill.”

Peggy Carr, the acting head of NCES, told Chalkbeat that she does intend to release the information White is requesting, and that the testing group has made extensive efforts to ensure that comparisons are valid during the transition to computer-based testing.

“We did the best of the best in terms of how we executed it,” Carr said of the organization’s study of the print and digital test results. “That is what I will share with the states.”

Still, White’s letter is likely to attract attention because he is among the best-known state schools chiefs — and because his request for more information was also backed by a letter from the Council of Chief State School Officers, which represents all state education leaders.

The digital-test dip

Students tend to do worse on exams taken on a computer or tablet than on those taken with pencil and paper. States have been finding this out on their own yearly tests, including the PARCC.

For the first time in 2017, most students took the NAEP tests digitally. A small number of students continued to take the test on paper, allowing officials to adjust for differences caused by the test-taking mode. The NAEP results are even being released later this time around because that analysis took extra time.

Carr won’t talk about this year’s results yet, and wouldn’t say whether NCES found that taking the test on a tablet affected students’ scores. But she noted that when tablets first started being used, in a 2015 pilot phase, NCES did see such a phenomenon, and that it’s common in educational testing.

“Everyone finds a mode effect when they go from paper and pencil to [digital],” said Carr, using the technical phrase for how different test-taking methods affect student performance.

White’s issue is with how NCES addresses the score dip that comes with the digital tests.

In his letter, White suggests that NCES is making the same adjustment for every student. That might not make sense, he argues. Louisiana and Massachusetts students, for example, have different levels of exposure to technology. Different states and groups of students might need different adjustments.

“No Louisiana student in 4th grade or 8th grade had ever been required to take a state assessment via a computer or tablet as of the 2017 NAEP administration,” White wrote. “This fact, coupled with a variety of social indicators that may correspond with low levels of technology access or skill, may mean that computer usage or skill among Louisiana students, or students in any state, is not equivalent to computer skills in the national population.”

It won’t be clear if White’s concerns have merit until NCES releases its state-by-state analysis.

Andrew Ho, a professor at Harvard and member of the NAEP governing board, said that such questions could be legitimate. For a state truly to be unfairly penalized, though, its students would have to respond differently than demographically similar students elsewhere in the country.

The Louisiana angle

It’s worth noting that there are political incentives for leaders to raise questions about results that don’t make their state look good.

The 2017 NAEP results have been shared with state testing officials, who are instructed not to discuss the results publicly until their wider release. White — who has highlighted gains his state had made on NAEP between 2009 and 2015 — said he couldn’t comment on Louisiana’s performance in the latest round of tests.

Even though researchers warn that it is inappropriate to judge specific policies by raw NAEP results, if White’s letter signals that Louisiana’s scores have fallen, that could deal a blow to his controversial tenure, during which he has pushed for vouchers and charter schools, the Common Core, letter grades for schools, and an overhaul of curriculum.

White said his state’s results are not what’s driving his concerns.

“I doubt that any mode effect would have radically vaulted Louisiana to the top or dropped Louisiana further below,” he said. “The issue is from a national perspective.”

Carr said she plans to respond to White’s letter in writing.



Mapping a Turnaround

This is what the State Board of Education hopes to order Adams 14 to do

PHOTO: Hyoung Chang/The Denver Post
Javier Abrego, superintendent of Adams 14 School District on April 17, 2018.

In Colorado’s first-ever attempt to give away management of a school district, state officials Thursday provided a preview of what the final order requiring Adams 14 to give up district management could include.

The State Board of Education is expected to approve its final directives to the district later this month.

Thursday, after expressing a lack of trust in district officials who pleaded their case, the state board asked the Attorney General’s office for advice and help in drafting a final order detailing how the district is to cede authority, and in what areas.

Colorado has never ordered an external organization to take over full management of an entire district.

Among details discussed Thursday, Adams 14 will be required to hire an external manager for at least four years. The district will have 90 days to finalize a contract with an external manager. If it doesn’t, or if the contract doesn’t meet the state’s guidelines, the state may pull the district’s accreditation, which would trigger dissolution of Adams 14.

State board chair Angelika Schroeder said no one wants to have to resort to that measure.

But districts should know that the state board does have “a few more tools in our toolbox,” she said.

In addition, if they get legal clearance, state board members would like to explicitly require the district:

  • To cede hiring and firing authority over at-will employees who are administrators, but not teachers, to the external manager.
    When State Board member Steve Durham questioned the Adams 14 school board President Connie Quintana about this point on Wednesday, she made it clear she was not interested in giving up this authority.
  • To give up instructional, curricular, and teacher training decisions to the external manager.
  • To allow the new external manager to decide if there is value in continuing the existing work with nonprofit Beyond Textbooks.
    District officials have proposed they continue this work and are expanding Beyond Textbooks resources to more schools this year. The state review panel also suggested keeping the Beyond Textbooks partnership, mostly to give teachers continuity instead of switching strategies again.
  • To require Adams 14 to seek an outside manager that uses research-based strategies and has experience working in that role and with similar students.
  • To task the external manager with helping the district improve community engagement.
  • To be more open about its progress.
    The state board wants to be able to keep track of how things are going. State board member Rebecca McClellan said she would like the state board and the department’s progress monitor to be able to do unannounced site visits. Board member Jane Goff asked for brief weekly reports.
  • To allow the external manager to decide if the high school requires additional management or other support.
  • To allow state education officials, and/or the state board, to review the final contract between the district and its selected manager for compliance with the final order.

Facing the potential loss of near-total control over his district, Superintendent Javier Abrego on Thursday afternoon thanked the state board for “honoring our request.”

The district had accepted the recommendation of external management and brought forward its own proposal — but with the district retaining more authority.

Asked about the ways in which the state board went above and beyond the district’s proposal, such as giving the outside manager the authority to hire and fire administrative staff, Abrego did not seem concerned.

“That has not been determined yet,” he said. “That will all be negotiated.”

The state board asked that the final order include clear instructions about next steps if the district failed to comply with the state’s order.

Indiana A-F grades

Why it’s hard to compare Indianapolis schools under the A-F grading system

PHOTO: Dylan Peers McCoy
Because Thomas Gregg Neighborhood School became an innovation school last year, the state uses a different scale to grade it.

A-F grades for schools across Indiana were released Wednesday, but in the state’s largest district, the grades aren’t necessarily an easy way to compare schools.

An increasing share of Indianapolis Public Schools campuses, last year about 20 percent, are being measured by a different yardstick than others, creating a system where schools with virtually identical results on state tests can receive vastly different letter grades.

The letter grades aim to show how well schools are serving students by measuring both how their students score on state tests and how much their scores improve. But as Chalkbeat reported last year, new schools and schools that join the IPS innovation network can opt to be graded for three years based only on the second measure, known as growth. Schools in the innovation network are part of the district, but they are run by outside charter or nonprofit operators.

Of the 11 Indianapolis Public Schools campuses (out of 70) that received A marks from the state, eight were graded based on growth alone. They included a school in its first year of operation and seven innovation schools.

At the same time, traditional neighborhood and magnet schools with growth scores as good as or better than the scores at A-rated innovation schools received Bs, Cs, and even Ds.

Of the 13 innovation schools that received grades for last school year, eight received As, two got Bs, two got Cs, and one got a D. Only Herron High School was graded on the same scale as other schools. (For high schools, grades incorporate other measures including graduation rates.)

The result is a system that most parents don’t understand, said Seretha Edwards, a parent of four children at School 43, a school that received a failing grade from the state but would have gotten a B if it were measured by growth alone.

“I just think it’s kind of deceiving,” she added. “I don’t think it paints a fair picture of the schools.”

Indianapolis Public Schools deputy superintendent for academics Aleesia Johnson said the growth scores show schools are on a good trajectory.

“If you see that kids are making progress in terms of growth, that’s a good sign that you’re on the right track,” she said.

Still, she acknowledged that “there’s still a lot of work to do” to get students to pass tests and show proficiency.

Johnson pointed out that often-changing standardized tests and different A-F grades can cause confusion for families, and those measures don’t provide a complete or timely picture for families who want to assess their schools or choose new ones. “I don’t think it gives a lot of valuable information,” she said.

Advocates have said the growth-only model makes sense because schools shouldn’t be held accountable for the low passing rates of students they just began educating. But in practice, the policy benefits charter and innovation schools, which enjoy strong support from Republican lawmakers.

“The concept behind the growth-only model was that we measured newer schools based off of what they are able to do for their students, rather than taking them where they received them,” said Maggie Paino, the director of accountability for the education department. “You’re taking strides to get toward proficiency.”

The situation is even more muddled than usual this year. Schools across the state received two letter grades. One was calculated under a state model that relies largely on test scores, and the other was determined under a plan the state uses to comply with federal standards.

In addition to helping parents choose schools, years of repeated low letter grades from the state can trigger intervention or takeover. But in recent years, the state has deferred to IPS on decisions about intervening in low-rated schools.

Back in 2012, the state took over four chronically low-performing Indianapolis schools. Since Superintendent Lewis Ferebee took over, IPS has taken aggressive steps to overhaul struggling schools by “restarting” them as innovation schools with new managers. Other struggling schools have been closed.

School 63, which received its sixth consecutive F from the state, might have faced state intervention in the past. But the school is unlikely to face repercussions because IPS restarted the school by turning it over to an outside manager. The Haughville elementary school is now managed by Matchbook Learning.

Shaina Cavazos and Stephanie Wang contributed reporting.