
Writing About Testing: Beyond the Numbers (from the Education Reporter, March 16)
Linda Perlstein, EWA's public editor

With testing season approaching in his state, a colleague recently came to me with a question: How can he publish the list of state test scores every year without alienating principals whose schools fall at the bottom of the list? Here are the suggestions I gave him. They're not designed to placate principals; they're just good journalism.

First, you have to set the stage early. If the score release day is the first time you are writing about the tests all year, you’re doing readers, and schools, a disservice. The news isn’t just the test results; it’s how instruction is shaped all year around the goal of succeeding on the test.

Second, let educators show what they're up against. One of the most effective reporting tools I have is to phrase things in terms of "challenges you face." Confront a principal about the failure to make AYP, and you'll receive a defensive response, if any. Ask that principal to explain to you (or better yet, show you) the challenges the school faces in making AYP, and you'll get a much better conversation, and a much richer understanding of what's going on. Ideally you'll have done this well before testing, but even framing the subject in this way for your test-results day story can help.

If you have already written a story showing the dissonance between the life skills instruction some special ed students must receive and the math and reading skills they face on the state test, if you have shown English language learners struggling over vocabulary on practice tests, if you have laid out the gaps between the state reading standards for third graders and those that will actually be tested, and if you have portrayed all that schools do in spite of these challenges, those scores will have much-needed context. (And the principals will be more likely to take your calls, though that's not reason alone to write these pieces.)

If tests are approaching in a week or two, stories that show the stress surrounding testing, and the desperate and sometimes bizarre things schools do to ease that stress (or, frankly, ramp it up), are great. But even this close to T-Day it’s not too late to give more meaningful context to what’s going on. It’s not too late to ask teachers what they’re worried and confident about, or to explain a school’s forecast for itself, based on all the benchmark tests given over the year.

After the results come out, ask principals: Is this what the benchmark tests showed throughout the year? If not, do you have a guess at why? I have seen principals blindsided when the benchmarks, which were created by the school system to mimic the state test, give them no reason to think students will falter when the real deal comes, but then they do. That’s a story.

Third, at any time of year, articles can explain what information schools get beyond the basic numbers, and how that affects the usefulness of the data. In Maryland, for example, reading test results are given for each student only in the broad categories of "comprehension of literary text," "comprehension of informational text" and "general reading processes." The state testing director told me once that breaking down the data any further would render them statistically insignificant, while teachers complain that such an approach gives them little idea of how their students really performed. How well did they understand main idea? Vocabulary? Did a certain child fail simply because she didn’t finish in time?

When I was writing about testing in Maryland, teachers could see the state test before it was given. Can they in your state? They certainly see the test while they are proctoring it. Yet in many states they may not look at students' graded tests. This secrecy, officials say, is necessary for test security (after all, most states reuse the tests from year to year), but it also can preclude educators from getting the kind of information they need to truly alter their instruction based on students' needs. I encourage you to write about the rules in your state, and whether they help or hinder educators in helping children.

Which is the point, isn’t it?

The Educated Reporter

Speaking of testing ... I'm back to the bookshelf after some time away, and I would be remiss if I didn't recommend a terrific book on understanding test scores. Dan Koretz at Harvard is probably on your source list already (well, he should be), but before you call him again, read Measuring Up: What Educational Testing Really Tells Us.

Measuring Up is the best book I know of to explain, from a technical viewpoint, the real value and limits of educational testing. (For the best book on the human aspects, I could recommend Tested: One American School Struggles to Make the Grade, but since I wrote it, that would be tacky. Still. Really. You might.)

Validity, bias, causation, sampling error: You may think you understand these terms, but Koretz puts them in Technicolor, with solid examples and accessible writing. Chances are, you’ll come away from the book wanting to do away with publishing the school test score rankings at all, and your editor will bonk you over the head and tell you to get real.

These days, the entire American education system revolves around these numbers. So it’s crucial to give them context, in a way that goes far beyond hastily acquired quotes from principals the day after scores come out. With the wisdom you’ve gained from reading Measuring Up, and the approaches I mention above, at least you’ll be able to explain better what the test scores tell us and, just as important, what they don’t.

Public editor Linda Perlstein is available to help you. Contact her at 410-539-2464 or lperlstein@ewa.org.

