In Illinois, a Kafkaesque series of computer glitches, printing and labeling mistakes, and human error by the testing company and state education officials has prevented schools from determining whether they made adequate yearly progress on reading and math exams taken by third through eighth graders last spring. Montana officials, by contrast, argue that their tardy score reporting was in fact planned. Regardless, the delays mean that students in both states cannot know officially whether they are eligible for free tutoring or for a transfer to another school if they attend a failing one, as mandated under the reform law. While a dozen states have experienced delays this year in pulling their scores together, none has been as far behind as the bureaucrats in the Land of Lincoln and Big Sky Country.
Illinois officials initially blamed the cascading snafus on Texas-based Harcourt Assessments, which in March delivered to about a quarter of the state's 895 districts tests that were riddled with errors or had missing or duplicate pages. Some boxes arrived at schools containing no tests at all, requiring last-minute scrambling (and planes chartered by Harcourt) to distribute the exams in time. While the testing itself appeared to proceed without many problems, a mountain of mistakes during the largely automated scoring phase delayed processing afterward. Illinois officials have also conceded that the state's new student identification system contributed further hitches. The system was designed to streamline scoring by assigning each student a number that included demographic and school data; in reality, score verification slowed to a crawl because district officials across the state entered incorrect information about race, income level and special education status for roughly 11,000 of the state's approximately 900,000 test-takers. The state originally promised final results by Oct. 31, but officials now say it will be well into the new year before the work is complete.
Montana school officials were sympathetic about Illinois' predicament but quickly distanced themselves from the storm of errors and incompetence that plagued their Midwestern counterpart. "We didn't have any assessments aligned to our standards prior to No Child Left Behind, so we've had to build our tests from scratch," said Joe Lamson, communications director for the Montana Office of Public Instruction. Because 2006 was the first testing year that required assessments for every grade from third through eighth, Montana set a generous mid-January deadline to process its results. In 2005, only students in fourth, eighth and 10th grades were tested, so the state easily got its scores in by late August. "We wanted to give ourselves extra time this year to get all of our scoring calibrated properly. We've met all our deadlines along the way," said Lamson. The spokesman couldn't lavish enough praise on the state's contracted testing company, New Hampshire-based Measured Progress, which he says has performed well on all aspects of its commitment. But it's worth remembering that Montana was responsible for only 60,000 student tests, less than one tenth of Illinois' workload.
The significance of late-arriving scores depends almost entirely on one's perspective on standardized tests. As the federal NCLB law comes up for reauthorization next year, testing critics are quick to point to the extensive delays. "Nowhere have schools stopped functioning because of the missing test scores. But they also don't know if they've moved up or down the performance ladder," said education advocate Julie Woestehoff, executive director of the Chicago-based non-profit Parents United for Responsible Education. She hopes snafus like Illinois' may actually prompt a rethinking of the law's parameters to include other forms of assessment in evaluating schools. "These testing errors show the need for multiple measures like student grades and performance portfolios, which are not as cheap and fast to administer but are more accurate," Woestehoff says. In 2005, 84% of Illinois schools made adequate yearly progress based on established annual targets under NCLB, up from 71% the prior year.
Indeed, criticizing the states and testing firms charged with carrying out the federal law ignores a far more crucial issue: whether standardized tests can ever really drive high-quality education. Says author and well-known standardized test skeptic Alfie Kohn: "These recent problems with implementation pale beside the appalling effects of NCLB itself. It's when this law is working 'properly' with all the tests given, the numbers obediently reported, and the attendant punitive consequences enforced that we really need to worry." Montana may not be sweating out its scores; last year, 92% of its schools made adequate yearly progress, one of the highest percentages in the nation. The state can only be grateful that speedy tabulation isn't part of the ranking system.