Educators question comparison of state proficiency exams to national tests

SALT LAKE CITY — A new report shows Utah's definition of student proficiency is similar to that of the National Assessment of Educational Progress, an exam tied to federal education dollars that is administered in all 50 states.

But the "honesty gap," as the report calls it, was much wider for other states, such as Georgia, where more than 90 percent of students scored proficiently on their statewide exam but less than 40 percent scored proficiently on the national exam, called NAEP.

The measure indicates inflated proficiency rates aren't at issue in Utah. But educators here do question the premise of comparing Utah's statewide exam, known as SAGE, to a national assessment, lauded by the report as "the gold standard of student assessment," that uses different testing methods and different academic standards altogether.

"Any assessment that isn't aligned to a specific state's standards, you're not going to see a close alignment with the outcomes of that assessment," said David Crandall, chairman of the Utah State Board of Education. "It's not something that we completely ignore, but we have to be kind of cautious as to how much stock we put into the results of (NAEP)."


The report released Thursday by Achieve, a national education reform organization, lists Utah as one of a handful of "truth tellers" whose state proficiency rates fall within 15 percentage points of their NAEP rates. Last year, Utah's gap was 5 percentage points or less on various measures, according to the report.

But Utah's "honesty gap" wasn't always paper-thin. Prior to SAGE, Utah's proficiency discrepancy was as high as 51 percent for eighth-grade reading and 41 percent for fourth-grade reading. Math scores showed less of a gap, at 38 percent for eighth-graders and 35 percent for fourth-graders, according to the report.

Utah's adoption of SAGE, which, unlike the prior assessment, is aligned with the new Utah Core Standards for math and English, raised the bar for students and caused proficiency rates to approach those of NAEP.

"It really just has to do with the change in definition of the word proficient and what we mean by that," Crandall said. "We changed the definition of proficient under SAGE, and to no one's surprise, the percentage of proficient students decreased. It didn't have anything to do with their performance, per se."

NAEP, sponsored by the U.S. Department of Education since 1969, is a paper-and-pencil test administered periodically to a statistical sample of fourth-, eighth- and 12th-grade students in every state. The results are used to form a national report card of America's academic achievement.

SAGE, or Student Assessment of Growth and Excellence, is Utah's yearly computer-adaptive exam, administered for the first time last year to almost every student in third through 12th grade.


While the report somewhat illustrates Utah's transition from its former assessment to SAGE, some say comparing the tests is a fruitless endeavor. Sharon Gallagher-Fishbaugh, president of the Utah Education Association, said even though SAGE produced roughly the same percentage of proficient students as NAEP, the two tests are far from comparable.

"NAEP measures completely different things in a completely different manner with a completely different set of students," Gallagher-Fishbaugh said. "SAGE is a very different assessment from NAEP, so I don't know how they can correlate."

She added that the newness of Utah's academic standards, technical setbacks while administering the test, and having to administer the test months before the end of the school year all call into question the reliability of SAGE's first year of data.

Such issues fueled debate during this year's legislative session as lawmakers considered eliminating the exam because of its shortcomings and because of how it is used to evaluate schools and teachers.

Gallagher-Fishbaugh said SAGE has "great potential" to inform instruction, but it will take a few years to iron out logistical problems and accumulate enough data before educators will be more confident in using the exam to guide teaching practices.

"I think what we need to do is quit spending so much time and effort on worrying about assessments and start looking at the kind of learning environments we have in schools, the kind of best practices we know will make a difference," she said. "We need to back away from trying to compare (assessments) and start looking at conditions for learning in schools."

Morgan Jacobsen
