SALT LAKE CITY — There is growing confidence that RISE testing data is valid and can be used for school accountability purposes, state and local education officials report.
According to Assistant State Superintendent of Student Learning Darin Nielsen, three analyses of the RISE test data have been completed. The findings will be presented to the Utah State Board of Education on Oct. 3, when the board is expected to determine whether the data is defensible for state and federal school accountability.
“USBE staff is more confident now than we were last spring that the data can be used for accountability purposes. Our confidence continues to grow,” Nielsen said in an interview Friday.
The RISE assessment vendor Questar Assessment Inc. conducted one of the studies to determine the impact of service interruptions when the test was administered last spring. A third-party contractor validated the study results, Nielsen said.
Another study conducted by a third-party vendor looked at whether the RISE test required a new baseline year after the state board switched to the RISE assessment from SAGE for the 2018-19 school year. SAGE stands for Student Assessment of Growth and Excellence.
A third study of test data was conducted by State School Board staff with help from Salt Lake City and Jordan school district officials. They were given early access to the full state data file to examine district-level data and statewide results, Nielsen said.
“Both have communicated to me that they didn’t see anything odd or unusual or unexpected in their data,” Nielsen said.
Jeff Haney, spokesman for Canyons School District, said Canyons officials were told there were “only a few thousand” results that have been called into question statewide.
“The last I heard, there was no question whether we should place confidence in the validity of the test,” Haney said.
Earlier this week, the Utah Legislative Auditor released a “limited review” of RISE testing issues. It said the state board found that approximately 3,546 tests were missing scores and required participation codes.
That is less than 1% of the more than 982,600 records from RISE testing.
Nielsen said some questionable results had nothing to do with test administration or technology, but rather with how the tests were proctored or with testing ethics issues.
Nielsen said there are problems with assessment each year, mostly confined to individual classrooms or schools. The 2018-19 school year was unique because the state board experienced “systemwide, high visibility interruptions of service,” Nielsen said.
State education officials were initially scheduled to present the test results and associated analyses to the State School Board in September, but Questar missed a July 15 deadline to deliver the data. It was delivered on July 23, but state officials asked the vendor to resolve issues with the data.
“We got our final data file on Aug. 2,” Nielsen said.
State education officials had also planned to issue school report cards in December, based on a July 15 delivery of the data and the assumption that its use for accountability purposes could be defended. Now the report cards might not be issued until early 2020, Nielsen said. Statewide testing results are one of several indicators that make up the report cards.
According to the legislative review, State School Board employees are working with the U.S. Department of Education to address federal reporting requirements that may be affected by delays in the receipt and analysis of RISE testing data.
Nielsen said the state education officials have taken educators’ concerns about the data seriously, “which is why we have spent significantly more time this year looking at our results than any year I’m aware of.”
Legislative auditors surveyed 90 school districts and charter schools to determine the impact of Questar’s testing issues.
Ninety-one percent reported they were negatively impacted, or might have been, according to the review.
“They reported that the delivery system failures created significant additional work for LEA (local education agency) employees and negatively impacted staff and student morale. One survey participant reported having to cancel classes because of the need to quarantine computers.
“Respondents also reported loss of confidence in the data for this testing period, citing incomplete tests, lost data, discrepancies between scores on different reports, and underperformance by students due to interruptions and strained testing environments,” the review states.
Nielsen said he doesn’t discount the frustration felt by school administrators, teachers and students over the disruptions and the stress some feel about state and federal accountability programs in general.
Interestingly, the testing outcomes for students tested on the five days last spring when testing slowed or had to be shut down “aren’t noticeably different than the outcomes for other students,” he said.
The board had a multiyear contract with RISE testing vendor Questar but voted in June to terminate it after technical and other problems plagued the statewide testing program in the spring.
In August, the board resumed its relationship with test vendor American Institutes for Research, maker of the SAGE test, as an interim provider of statewide assessments for students in grades three through eight. The board voted to enter a three-year, $21.6 million contract with AIR.