DOE backs off release of Teacher Data Reports in face of UFT lawsuit
by Maisie McAdoo | October 28, 2010 New York Teacher issue
The UFT scored an 11th-hour victory on Oct. 21 when the city Department of Education backed off a decision to release test-based data reports on nearly 12,000 teachers, at least until a Nov. 24 court hearing.
The DOE made that commitment before a state judge as the UFT was in court trying to block the release of the individual names.
The DOE said one thing about the TDRs in 2008 …
“… a new tool to help teachers learn about their own strengths and opportunities for development … The Teacher Data Reports are not to be used for evaluation purposes. That is, they won’t be used in tenure determinations or the annual rating process.”
Chancellor Joel Klein, “Dear Colleague” letter to teachers, Oct. 1
“It is the DOE’s firm position that Teacher Data Reports will not and should not be disclosed or shared outside of the school community, defined to include administrators, coaches, mentors and other professional colleagues authorized by the teacher in question … In the event a FOIL request for such documents is made, we will work with the UFT to craft the best legal arguments available …”
Letter from Deputy Chancellor Christopher Cerf to Randi Weingarten, Oct. 1
… and changed its tune in 2010:
“… we believe that the public has a right to this information.”
Chancellor Joel Klein, New York Post, Oct. 24
“Deputy Chancellor John White defended the city's efforts to release the ratings, saying the data would strengthen the city's case for changing the policy on firing teachers.”
New York Daily News, Oct. 20
The DOE had planned to turn over to the media on Oct. 22 the names of 4th- through 8th-grade English language arts and math teachers, along with highly debatable ratings based on their students’ standardized test scores, scores that were themselves found to be inflated and inaccurate.
Teachers and education experts were incredulous that the DOE was contemplating such a move.
“How dare they?” demanded one chapter leader, echoed by many others, when UFT President Michael Mulgrew warned the Delegate Assembly about the possible release.
A Columbia University researcher, who did some of the original work on the reports, told a reporter that privacy was a major issue and there should be “a careful decision about whether this should be released.”
Aside from privacy concerns, Mulgrew feared that the reports were so inaccurate that they could badly mislead the public. Welcoming the delay, he said, “We’re glad that parents won’t be subjected to more unreliable information from the DOE. Our teachers can now focus on the real task of moving education forward.”
Despite a written promise to the UFT two years ago that the reports “will not and should not be disclosed,” the DOE was readying the reports for four newspapers and New York 1 news, which had requested the information under the Freedom of Information Law.
The reports use a new and complex statistical technique called value-added measurement to try to determine the impact of an individual teacher on a student’s test score. As of now, the measure is “not ready for prime time,” according to researchers. The DOE itself acknowledges that the reports have super-wide margins of error.
When the DOE claimed that its hands were tied by the Freedom of Information Law requests, the UFT was left on its own to block the release and explain, to a public primed by months of teacher-blaming in the media, why the reports are not a good measure of teacher worth.
The UFT has assisted researchers to refine the value-added methodology in hopes that it could someday become a useful instructional tool. It also agreed with the State Education Department last summer that value-added measures could be one part of teacher evaluation when and if they could be made reliable.
“In order to do that work, you need teachers involved. But you can’t then break your promise to them,” said Mulgrew.
Teachers responded to news of the threatened release with dismay.
“Many people are not going to want to teach 4th through 8th grade. And teachers are not going to want to teach at-risk kids,” said Taneeka Jones, a teacher at PS 42 in District 27.
“You’re holding teachers’ feet to the fire on things they have no control over,” said Denise Johnson, of PS 104 in Far Rockaway.
The UFT is working with Community Education Councils and parent groups to explain exactly how and why the TDRs are not a useful measure of teacher effectiveness and to reassure them that the union’s commitment to working with parents to find real solutions to our schools’ challenges is as strong as ever.
“Parents understand their children are more than a test score,” Mulgrew said. By making education about a single data point, “you’re not really helping children.”
When they say experimental ...
Value-added measurement, with more refinement, has the potential to be a useful instructional tool and one of the multiple ways to evaluate teachers. But the UFT considers the current reports — whose arcane algorithms typically interest only a tiny pool of postdoctoral statisticians — to be flawed and unreliable for the following reasons:
- The student test scores on which these reports are based have been discredited. Testing experts from Harvard University have found that the New York State exams were scored differently every year and tested only a narrow band of knowledge; reports based on those exams cannot give good information.
- The “value-added” measurement that the reports use keeps giving different results. Teachers’ outcomes vary widely from year to year, or classroom to classroom. The DOE reports give error margins on average of 54 percentage points for teachers with one year of data, because the statisticians just aren’t sure of their reliability.
- “Garbage in, garbage out” is an old saying about computer calculations. It’s true for the data reports. Multiple instances have come to light of teachers being graded on students they never had or subjects they never taught. If the inputs aren’t right, the outcome can’t be either.