DIGGING FOR DATA
April 14, 2008

A column by EWA Public Editor Linda Perlstein
In a piece Alan Borsuk wrote last fall for the Milwaukee Journal Sentinel about the black-white gap in Wisconsin’s NAEP scores, the nation’s largest, the state superintendent said, “I find it very distressing to look at this. There isn’t anything more important [in education].”
Apparently, there are more important things in the Department of Public Instruction’s press releases. In the release’s 10 sentences of quotes attributed to superintendent Elizabeth Burmaster, the racial gap went unmentioned.
State and school system press releases about test scores rarely merit more than a glance—and a skeptical glance at that. Reporters may as well toss them in the recycling bin and go straight to the raw data. Not the data appended to the press release, but the data provided by the test-giving body: the College Board, NAEP, the state, and so on.
In the case of Wisconsin and NAEP, the department’s press release led with the fact that scale scores in reading and math exceeded the national average, and that scores increased statewide in three of four categories. Alan mentioned those positives lower in his story, but his lede had a different focus: “The average reading ability for fourth- and eighth-grade black students in Wisconsin is the lowest of any state, and the reading achievement gap between black students and white students in Wisconsin continues to be the worst in the nation.” He didn’t have to look far in the raw numbers to see this was the case, but he certainly wouldn’t have noticed had he just seen the press release, which said about the matter only, “By racial/ethnic group, achievement gaps for Wisconsin also are apparent.”
Some common tricks to look out for:
* Touting gains based on a small number of students. This happens frequently with demographic subgroups. Andy Gammill of the Indianapolis Star pointed out that in a February press release titled “Indiana continues gains in Advanced Placement results,” the Indiana Department of Education highlighted a 44 percent increase in the number of American Indian students taking AP tests, and a 42 percent increase in the number of them scoring 3 or higher. A small bit of research shows that only 22 American Indian students had taken an AP test the year before; a 44 percent increase on a base that small works out to roughly 10 additional students.
Tara Manthey of the Arkansas Democrat-Gazette spotted the same trick in her state’s most recent AP press release, which led with the news that Hispanics and American Indians had “eliminated the performance gap” on the test. The American Indians’ gains were based on so few students that they may well have been statistically insignificant.
* Failing to mention key mitigating factors. While the Arkansas press release’s assertion about Hispanics was not an outright lie, Tara points out, it was at the very least misleading. Hispanics’ apparent closure of the gap was almost entirely attributable to native Spanish speakers taking the Spanish language and literature exams; Hispanics didn’t close the gap in English, the sciences or math.
* Leaving out entire sectors of the population. Press releases often mention only the subjects or demographic groups whose scores improved. The Arkansas AP press release barely mentioned African Americans’ performance, which was quite dismal compared to whites’.
Sometimes officials are daring enough to leave out entire grades. When David Harrison started on the schools beat for the Roanoke Times in 2006, he wondered why an extensive PowerPoint the then-superintendent was showing around—taking credit for a narrowing of the achievement gap based on state test results—didn’t jibe with data provided by the Virginia Department of Education. The state’s analysis showed a less sunny picture for Roanoke.
It turned out the superintendent was basing his boast on a three-year “rolling average” of results that completely omitted fourth, sixth, and seventh grades, which had been tested for only one year and so couldn’t figure into the average. It’s not a bad idea to base your analyses on the longest time frame possible, of course. It’s a bad idea, though, to brag about “division-wide achievement,” as the superintendent did, without mentioning what you’ve left out of the equation.
* An emphasis on AP participation without mention of scores. In the Indiana release, the superintendent boasted, “More of our students are making the connection between challenging high school courses and college-readiness.” But whether they are college-ready is another matter: the percentage of students scoring 3 or above (generally the minimum threshold for receiving college credit) on at least one test during high school increased only from 9.2 percent of the student population to 9.7 percent. (For minority groups, the percentage scoring 3 or higher actually decreased.)
* Last, but not least, the misleading headline. “Indiana school accountability ratings show improvements,” a state release this month proclaimed. In fact, the ratings showed even more declines. The headline referred to the fact that 21 percent of the state’s schools moved into a higher performance category, compared with 17 percent last year. Go further down the page, though, and you see that 23 percent of schools moved into a lower category. (The rest stayed the same.)
There are, of course, many more ways states and school systems slice and dice their data in hopes that reporters rushing on deadline won’t look closely. So don’t let them get away with it: always dig deeper.
