In Newark, a new high-stakes exam that every eighth-grader will take next month is raising questions — and not the multiple-choice kind.

The district has been rushing to complete the test, which the superintendent revealed at a parent conference in December, so that students can take it next month. It will determine, along with other factors, which students are admitted to the city’s coveted magnet schools in the fall.

The implications for students’ futures are huge: Students who attend magnets are far more likely than their peers in traditional high schools to graduate and earn college degrees. Yet officials have said little publicly about who’s developing the test, how schools will use it, or even why it’s necessary.

“Why make kids take an additional test unless there’s a very good reason for it?” said Jonathan Taylor, a research analyst at Hunter College in New York City who has studied that city’s controversial high-school entrance exam. “They already have to take enough of these tests.”

The exam appears to be aimed at making Newark’s magnet schools more selective, based on Superintendent Roger León’s belief that they have been admitting some students who are under-qualified or insufficiently interested in the schools’ themes.

“The whole concept that anyone and everyone can get into the magnet high schools — that’s not why they were designed,” León said last month. “You actually have to qualify to get into those schools.”

But the data show that Newark’s six magnet schools already enroll the city’s highest-achieving students (based on state test scores), and very few students who are still learning English or have disabilities, who tend to perform less well on standardized tests. León’s move comes as New Jersey’s governor tries to cut back on high-stakes testing, and the mayor of nearby New York City has proposed eliminating a similar gatekeeping test entirely.

Newark’s “magnet schools are definitely enrolling different kids — fewer English language learners, fewer special-education students, fewer students who are eligible for free lunch,” said Christopher Tienken, a professor of education administration at Seton Hall University. “I think they should actually be looking at measures that increase the diversity of the schools.”

Beyond the purpose of the test, experts also raised questions about its design, as the district appears to be creating it in-house with the help of local educators. Tests used for high-stakes purposes such as admissions decisions are expected to meet accepted standards and be carefully vetted.

“The test should be validated before using it,” Taylor said. “You don’t have a test, start admitting kids, and then look to see 30 years later, ‘Well, gee, is this a valid test?’”

Officials may have answers to these questions, but they haven’t yet shared them publicly. The district did not respond to questions for this story or make any officials available for comment.

For now, Chalkbeat has rounded up everything we know about the test so far. And we spoke with experts about best practices for introducing a new exam like this — along with the risks.

What’s on the test?

The test will measure students’ English and math skills.

The district is creating the exam itself, with assistance from principals and other school staffers. The educators helped write math questions and shared essay prompts their schools have used to assess students’ reading and writing skills, said Carla Stephens, principal of Bard High School Early College–Newark.

León “had many calls and meetings where he has been consulting with the magnet school principals about the test,” said Stephens, who shared her magnet school’s writing prompt with the district.

The test will also try to assess students’ interest in each magnet school’s unique focus, which includes science, history, and the arts. León, who attended a magnet school, has said he believes some magnet students lack a real passion for the schools’ themes.

“Magnet high schools were always supposed to be a high school where students had an interest in what that specialty was,” he said. “We want the admissions test to be able to gauge that.”

Who will take it?

All Newark Public Schools eighth-graders will sit for the test on Friday, Feb. 15.

They will do so in their own schools during the school day — a decision that experts applauded. Other districts have been criticized for administering high-school and college admissions tests on the weekends, making it hard for some students to participate.

All non-district students hoping to attend a magnet school will take the test on Saturday, Feb. 16.

How will schools use it?

Magnet schools will consider the exam scores as they review and rank applicants. The rankings will determine which students are admitted for the 2019-20 academic year.

The scores will be considered alongside factors the schools already look at, including students’ grades, state test scores, and attendance records. Some schools also interview applicants and assess their writing and math skills, while Arts High School holds auditions.

Experts endorsed the district’s decision to have the schools use the exams in conjunction with other admissions criteria, which they said provides a more complete picture of each applicant.

However, school staffers said they have not been told how much weight each of those factors will be given — and whether schools or the district will decide that.

At one of the city’s most selective magnet schools, Science Park, parents and administrators clashed last year over how much weight should be given to applicants’ state PARCC scores. In a compromise, the school lowered their weight, making the PARCC scores count for 70 percent of applicants’ ranking. Now, Science Park and the other magnets will have to factor the new entrance exam into the mix.

Traditional high schools will also use the admissions test scores, León has said. The scores will determine eligibility for new gifted-and-talented programs that León has directed those schools to establish.

Is the test ready?

By the looks of it, not quite.

Students were originally scheduled to take the test this week, but the district quietly postponed it until February. The district chalked that up to “logistical modifications.” But as recently as last month, officials said they were still revising the exam.

According to someone who spoke with the superintendent, part of the problem was that the school staffers who helped write the test borrowed heavily from assessments that high-school students take during the year.

“He said the questions were basically copied and pasted,” the person said. “It wasn’t good, so they had to redo it.”

And at a parent meeting this week at Science Park, Principal Kathleen Tierney said the test is “still being vetted,” according to an attendee.

What do experts say?

Researchers who study testing asked several pointed questions about Newark’s new high-stakes exam.

First, they wanted to know how officials decided it was necessary. They noted that the magnets already look at students’ grades and state test scores.

Taylor, the research analyst at Hunter College, pointed out that entrance exams can be less effective in identifying promising students than officials realize, calling into question their necessity. In a recent study, he found that the exam used to determine who gets into New York City’s elite “specialized” high schools was actually less predictive of students’ ninth-grade achievement than their grades or state test scores from middle school.

The experts also questioned how the district is ensuring the exam meets industry standards — especially when it’s being developed on such a tight timeline.

One key standard is that a given exam does what it’s intended to do; for instance, help a school predict which students will be able to keep up with its rigorous academic program. Another standard is that the exam is not biased against any group, such as girls or black or Hispanic students.

Some experts advised against using the test results in high-stakes admissions decisions before it has been shown to meet those standards.

“What they should be doing is piloting this test for at least one year,” said Tienken, the Seton Hall professor. “Test design is a multi-year process.”

Kurt F. Geisinger, director of the Buros Center on Testing at the University of Nebraska, said that trying out an exam on a sample of students before its full launch can help detect biases — but that can be difficult to do without having questions leak out. Either way, he said, the district should expect to continuously improve the exam.

“These things do take time to build,” he said. “You can’t expect them to be perfect year one.”