A recent audit report says the scoring of statewide tests was reliable last year, but a closer look at the data challenges that conclusion. The data show state education department graders scoring reading tests higher on average than scorers hired by the auditing firm.
The firm, Pearson Educational Measurement, asked independent graders to score a sample portion of tests. Then it compared those scores with the ones given by state graders, usually teachers in the same area as the students they graded, state and city education officials said. The point is to confirm that all students were judged against a single standard, the Pearson report says.
Data inside the report show that state graders' average reading scores were almost always higher than scores from the independent graders. Of 21 questions on the 2006 exams, state scorers gave a lower average score on only three and a higher one on 17, matching the independent graders on the remaining question. The discrepancy ranged from 0.09 out of nine points on the third-grade test to 0.84 out of 13 on the eighth-grade test.
"That doesn't sound like a lot," a former city testing official, Robert Tobias, said. "But these very fine differences can actually have sort of a multiplier effect."
New York City reading scores mirrored the state trend.
A retired city education department data analyst, Fred Smith, said the tests' high stakes might explain why local scorers graded higher. "There's a fear that this is going to be used against them," he said. "That works in the direction of leaning positive."
A state education department spokesman, Tom Dunn, dismissed the idea. "We believe in the integrity of the teachers of the state of New York," he said. "The report from the independent auditor confirms this."
A New York City education department spokesman, Andrew Jacob, defended the city's scoring process and said officials are reviewing the audit.