The complexities of administering this year's WASL
By Stephen Kramer
When tests become high-stakes, so do security measures. Last year we heard, for the first time, about WASL police -- mysterious folks who might show up, unannounced, during testing sessions. They presumably would be checking to see whether we had left an instructional poster uncovered on a classroom wall and whether we were reading the test instructions correctly.
This year, those of us who proctored tests were required to attend test administration training sessions. There, in addition to specific instructions about test procedures, we were informed about the penalties we'd face if we strayed from proper testing procedures. The penalties include fines and the loss of our teaching certification.
Interestingly, and sadly, the new security measures also mean that teachers are no longer allowed to look at their students' responses -- or even to read the questions in the test booklets. The new policy is spelled out very clearly in the 2006 Assessment Coordinator's Manual (www.k12.wa.us/assessment/TestAdministration). The policy is also emphasized in the legalistic documents teachers are required to sign before and after giving the tests.
The reason given for this policy change is, of course, to make sure that no student receives an unfair advantage. An unfortunate effect, however, is that teachers will be left even more in the dark about what WASL scores really mean. Comments I've heard from third-grade teacher colleagues, who are new to WASL testing this year, indicate that they'll have no idea how to understand their students' scores because they weren't able to look at the test items or see how their students responded!
In past years, after my students completed their writing tests, I always looked over the final drafts in the test booklets. Some students had spent 10-plus hours over two days working on their essays, and they wanted me to read their writing. I was always curious about how my students would respond to an all-day, completely unsupported writing activity. This was a classroom assignment I never gave because it's so developmentally inappropriate for fourth-graders. But I also read the papers because I felt I had an obligation to honor my students' many hours of effort. It seemed like the right thing to do before sending off their creative work to be assembly-line scored by an anonymous reader and then shredded.
Over the years, I saw the efforts of good fourth-grade writers who ended up, in frustration near the end of the day, erasing whole sections of their final drafts. I saw final drafts where it appeared that entire paragraphs from a first draft had accidentally (or purposefully) been omitted by small hands that had become too tired to finish. And I frequently saw students who started writing to an expository prompt and then drifted into a wonderful (but off-topic) narrative piece. That's a rather common practice among children this age. Reviewing the student work gave me at least a bit of insight into the scores that came back in the fall.
This year, shortly after my students began their work on the first writing day, a couple of them approached me and asked whether I'd read their writing at the break. I said no, I couldn't. One then asked if I could read her writing at lunch. I said I wouldn't be able to do that either. "Well, Mr. Kramer," she asked, "then can you read it after school? I really want you to read it!" I told her that I wouldn't be able to read it after school either. I explained to the students that, this year, no one at our school could read their writing. As my class listened to my announcement that I couldn't be part of their audience, I could practically feel the writing energy drain out of my students.
In previous years, when I looked through student work in the reading and math sections (after the testing session was over), I was sometimes surprised by high-achieving students who had marked what seemed to be incorrect answers to multiple-choice questions. Sometimes I'd ask these students about their answers. Almost invariably, I found that the students had thoughtful and logical reasons for answers that I knew would be scored as incorrect.
Or I'd see a response to a longer-answer problem that intrigued and puzzled me. The child may not have been capable of explaining in writing how he or she got that answer, but when I asked the child about it (after the testing session was over), I often found creative and reasonable thinking behind the response. Again, I was quite certain the answer would be marked wrong, but I had gained some insight into a student's thinking. Because I'm no longer allowed to look at my students' work, I have less knowledge about my students' WASL performance this year than in any other year that I've given the test.
Several years ago at a local school, a group of fourth-grade teachers was looking over a section of the math test in the morning, just before students arrived to take it. None of the teachers in the group could figure out how to find the answer to one of the problems. They were almost ready to call OSPI to report an error in the test booklet when another member of the team came in to take a look. That teacher did figure out a way to solve the problem, so the call wasn't made. But how can teachers ever know whether there are mistakes or problems with the tests if they aren't allowed to look at the items?
As an elementary teacher whose WASL administration experience dates back to the year the WASL was piloted, I have seen many curious things: changes in cut scores to increase pass rates, wholesale rescoring of writing tests, students of mine who I can't believe passed and students who I can't believe failed, and fourth-grade math questions in the early years that were so difficult, confusing and poorly written that, in the words of a colleague from another district, "even fourth-grade teachers couldn't figure them out." Many of these experiences left me shaking my head.
This year, I'm shaking my head again. Although testing/measurement experts widely support the use of multiple measures of student learning, our state policymakers have jettisoned the ITBS and reduced our measure of academic achievement to a single, high-stakes test. Unfortunately, new policies prohibiting teachers from reading any of the test items or student responses have made the WASL even more secretive and the meaning of WASL scores even more of a mystery.
Stephen Kramer teaches fourth grade at