by Maisie McAdoo | May 10, 2012 New York Teacher issue
“I would have no idea how to answer those questions.”
— Daniel Pinkwater, author of the original passage adapted into the “pineapple” question on the Grade 8 ELA exam.
“Our son’s homework for months has consisted of practice tests; the main function of school seems to be to teach him to read passages of little or no literary merit and then decide which of four possible answers to equally insipid questions is the ‘right’ one.”
— Two professors, parents of a 3rd-grader, on the Inside Schools blog, April 23
What teachers have been saying for years about the content on state ELA tests has finally resonated with journalists, professors and even the state education commissioner. After 8th-graders voiced their bewilderment over the questions on this year’s infamous “Hare and the Pineapple” passage, Commissioner John King struck it from the test.
The irony here is that these are new tests, launched to fix the problems with the old tests. They were written by a different test maker and extensively reviewed. They are designed to cover a broader swath of standards, test “critical-thinking” skills and be scored with new measures.
But so far the tests seem no better, maybe worse, than the discredited ones they replaced.
Building the plane while flying it?
In 2011 the New York State Education Department hired NCS Pearson Inc. for a five-year, $32.1 million overhaul of the state tests. They are to be aligned with new “college-ready” state standards, and with the new Common Core Learning Standards as they are finalized, and with the NAEP exams.
“It’s a very complicated weave they’re doing,” says retired DOE assessment expert Fred Smith, who has brought many testing shortcomings to light. To develop the new tests, Smith says, Pearson made them fully a third longer than last year’s in order to “field test” new items. Students can’t tell which are field-test questions and which are operational ones, but the extra items required a third day of testing in each subject.
Pearson’s contract requires it to use more “informational” text — half or more of the questions must now use nonfiction passages, requiring many new items. They are supposed to test “higher-order” analytic, inferential, interpretive or predictive skills. In addition, ELA tests will be scored for “writing mechanics,” which they weren’t before.
The tests will remain in transition through at least 2015. And unlike in previous years, they won’t be published after they are given, making them nearly impossible to review.
Quality review suggested
So the tests are to be longer, harder and less predictable. That might be OK if they were better. But early returns suggest they are not. The principal of a Park Slope elementary school wrote the State Education Department to say they were terrible. Other teachers and principals told bloggers and reporters they’d never seen so many “trick” questions.
And if “The Hare and the Pineapple” is any guide, then either the quality control on these new tests is lacking, or they favor a kind of middling level of thinking that can trip up the brightest, the English language learners forced to take the exams, and children who simply have their own minds.
The way to get all six questions right on “The Hare and the Pineapple” was either through a process of elimination or clever guessing. Even then, of eight people — all college grads — who tried it at our request, only one answered all six questions “right.”
Nor does that seem to have been a rogue question. Published guidance for scoring the tests suggests that the highest points come from taking a plodding, literal-minded approach, using big words when simpler ones would do and writing long.
The sample 3rd-grade reading passage is from an Algonquian legend that imagines a hunter captures the sun one morning in a big net, and it must beg for release. Many animals try to help the sun and a small, determined mouse finally succeeds in freeing it. The question, which kills the beauty of the story: “How do we know this could not happen in real life?”
The top score of 3 was awarded to a student who wrote “the sun can’t talk and anyway it’s in space.” The child who wrote essentially the same thing but in a run-on sentence got a 2. The response that was scored at 0 was grammatically perfect, recounted the story and drew the moral lesson but failed to say why it couldn’t happen.
The smartest test-takers learn early: Just figure out what they want and give it to them. That is the timeworn secret of these tests and it appears nothing has changed. “They” don’t want your ideas. They don’t want you to stray from the text. They don’t want you to use your judgment, offer a novel solution or “overthink.” They want you to figure out what they want and give it to them.
Some adults might say that’s practical preparation for life. But it doesn’t make us wise.