New York City spat over publishing teacher rankings reaches brief truce
The New York City school district wants to give the 'value added' ratings for its teachers to the press. The teachers union is suing. On Friday, the district agreed not to release the data before a Nov. 24 hearing.
The teachers union in New York City won some time today in its battle against the school district’s plan to give to reporters “value added” evaluation data that ranks teachers against their peers.
The United Federation of Teachers filed a lawsuit in state supreme court arguing that the data reports do not qualify as records that must be released under freedom-of-information laws. It also argued that the reports are often unreliable and, if released, would “cause the public to form unsupported conclusions as to teacher quality” and “irreparably harm the professional reputations of educators.”
The school district had planned to release the reports with teacher names to seven news organizations Friday but agreed today not to do so before a Nov. 24 hearing.
As more school districts around the country begin to link teacher evaluations to their students’ gains on standardized tests, controversies continue to bubble up about what uses of such data are appropriate and fair.
This summer, The Los Angeles Times sparked national debate when it published a series of stories and a database ranking 6,000 teachers against their peers, based on a value-added analysis by its reporters.
Secretary of Education Arne Duncan has weighed in on such matters, saying that parents and the public have a right to know about teachers’ performance.
What are value-added scores?
Value-added scores are meant to show the effect a teacher has on his or her students’ test scores. The methodology attempts to isolate the teacher’s role and make comparisons among teachers by statistically controlling for various factors among their students, such as ethnicity, disabilities, and socioeconomic status.
But such analyses are relatively new, and there is no consensus yet on the accuracy, usefulness, or ethics of publicly reporting such data at the individual teacher level.
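The city’s model is more elaborate than this, but as a rough illustration of the underlying idea, here is a minimal sketch in Python, with made-up student records and hypothetical teacher IDs: it predicts each student’s score from a prior score and demographic controls, then credits each teacher with how far his or her students beat that prediction.

```python
# Illustrative sketch only -- NOT the New York City DOE's actual model.
# A crude value-added estimate: regress current test scores on prior scores
# and student-level controls, then average each teacher's residuals.
import numpy as np

# Hypothetical student records:
# (prior_score, low_income, disability, current_score, teacher_id)
students = [
    (640, 1, 0, 655, "T1"),
    (610, 1, 1, 600, "T1"),
    (700, 0, 0, 720, "T2"),
    (670, 0, 0, 690, "T2"),
    (650, 1, 0, 645, "T3"),
    (620, 0, 1, 640, "T3"),
]

# Design matrix: intercept, prior score, and demographic controls.
X = np.array([[1, p, inc, dis] for p, inc, dis, _, _ in students], dtype=float)
y = np.array([cur for _, _, _, cur, _ in students], dtype=float)

# Fit the expected-score model by ordinary least squares.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ coef  # how far each student beat (or missed) expectations

# A teacher's "value added" is the average residual of his or her students.
value_added = {}
for (_, _, _, _, teacher), r in zip(students, residuals):
    value_added.setdefault(teacher, []).append(r)
print({t: round(float(np.mean(rs)), 1) for t, rs in value_added.items()})
```

With only a handful of students per teacher, as in this toy example, the averages are noisy, which is precisely the reliability concern critics raise about individual-level scores.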
“There are bad teachers out there, and we have not had any ability as communities to identify them, and this data will help do that,” says Kelly McBride, a senior faculty member focused on media ethics at the Poynter Institute in St. Petersburg, Fla. But the onus is on journalists who release such data, she says, to “really do an honest assessment of the flaws [in the data], and then give the audience the information it needs to put the data in the appropriate context.”
Others say there are so many complexities that putting out individual scores may actually mislead the public.
“Publicizing these very bad measures will do more harm than good,” writes New York University professor Sean Corcoran in an e-mail. In his research on New York’s teacher scoring, he has found that “the majority of teachers could not be statistically differentiated from 60 percent or more of all other teachers.”
While the data might be useful for broad policy decisions or even as one factor in personnel decisions, publishing it with individual names attached is problematic, says Jeffrey Henig, an education and political science professor at Columbia University’s Teachers College. For one, he says, what will district officials do if parents start requesting that their children be placed in the classrooms of higher-scoring teachers?
More broadly, Professor Henig says, “we’ve got some major issues of teacher recruitment and retention as a society.... How attractive a career will it [be] if each year teachers have a score hung around their neck, with parents and others invited to take shots at them?”
In New York, data have led to less tenure
The 12,000 teachers affected have had the opportunity to access their 2009-10 reports with a password, says New York City schools spokesman Matt Mittenthal.
The data reports have started to factor into personnel decisions in New York. “This year for the first time we advised principals to use them as one of several different considerations in deciding tenure,” says Mr. Mittenthal. This was recommended for teachers who fell in the top and bottom 5 percent. “As a result, we saw the percentage of teachers receiving tenure go down.”
The union “took part in the DOE’s [Department of Education] ‘value-added’ experiment with the hope that it would produce a product that would help teachers improve their craft. Unfortunately, this has not happened,” said UFT President Michael Mulgrew in an Oct. 20 press statement.
One of at least seven news organizations that have submitted a request for the value-added scores is GothamSchools, an online New York education news source. Editor Elizabeth Green says the site intends to use the data as a reporting tool, not to publish individual teacher names and scores.
She hopes the data can be a starting point for more reporting on teacher effectiveness, placed in the context of research about how the scores are created. “I’m not sure every news organization is going to be as responsible.”