Blog: The Educated Reporter

Dissecting the Data on Charter Schools

A new study from Stanford University’s Center for Research on Education Outcomes (CREDO) concludes that charter school students in some states are making respectable academic gains, while students in other states are falling behind their peers at traditional public schools. (You’ll find a handy aggregation of the media coverage at EdMedia Commons.) The new CREDO report, like much of the charter schools research, is expected to spur criticism of its methodology and findings.

At EWA’s 66th National Seminar, held at Stanford last month, we examined the challenge of how reporters can best evaluate charter school research in a session moderated by the Huffington Post’s Joy Resmovits. We asked some of the education reporters attending the seminar to contribute blog posts from the sessions. Today’s guest blogger is Erica Green of the Baltimore Sun. The full podcast can be found here. Stream any session from National Seminar in your browser, or subscribe via RSS or iTunes.

The stories about the successes and failures of charter schools are often subjective. That’s because even the most objective measure used to tell those stories—research—may be skewed. So, what can reporters do to ensure they are telling the most accurate story possible?

Jeffrey Henig, of Teachers College, Columbia University, and Margaret “Macke” Raymond, of Stanford University’s Hoover Institution, Center for Research on Education Outcomes, offered some simple pieces of advice: Do your homework; tap your natural skepticism; and beware of painting the picture of charters with broad strokes.

More importantly, Henig said, beware the most common mistake: “Never look for a single study to answer a question once and for all.”

That’s because no two charter schools, anywhere in America, are alike. The families who choose them are each different, creating a natural “selection bias.” Just as important, charters’ role in the education world is part of a complex ideological and partisan debate, realities that “raise the stakes of how research is positioned and presented in the public arena,” Henig said.

Henig urged reporters to question the “internal validity” of research, whether its findings support clear inferences about the quality of the charter schools it studied, and to be particularly skeptical of “external validity” in charter research, whether any of its conclusions can be applied generally and broadly to all charter schools.

According to Henig, several reports fail to present one of the two accurately, so he suggests cutting out the intermediaries when reporting a story and going straight to the researchers. Ask them how confident they are in the research and what they believe the other side of the argument is.

“If they say they are the study of all studies,” he said, “your antennas should start to quiver.”

In the same vein, Raymond warned that reporters should also be careful when they look to localize a national issue, or look for trends. For instance, when a national charter school story breaks (Read: Oakland’s American Indian Charter financial scandal), “we have to resist the temptation to extrapolate,” she said.

Apply the same resistance to the temptation to find trends, Raymond said. “We are no longer in a world where you can take four data points, draw a line through it, and call it a trend,” she said.

Charter school research, Raymond said, is no longer “an area of quick study.”

For researchers, the bar has been raised—a study that used to take her shop six to eight months now takes two to three years. Raymond said reporters should thus take more time examining the research. But, she added, they usually don’t have the time and attention spans to delve into a report, and by the time she’s fully prepared to talk to reporters, they’re on to the next thing.

Unfortunately, that means news stories on research may be overly influenced by advocates or others with a policy agenda. And politicians sometimes want to form policy based on what could be an incomplete picture.

So, in navigating these murky waters, Raymond suggests journalists ask themselves: What’s meaningful vs. what’s statistically significant? Pose those questions to everyone you interview, and put their answers into context with at least two to three years of data.

Henig and Raymond pointed reporters to several resources that they believe provide some of the most credible research. Raymond said the National Alliance for Public Charter Schools compiles charter school studies from all across the country. They are “fair” to even those that are critical, she said. Henig pointed to the CREDO report “Multiple Choice: Charter School Performance in 16 States,” considered the most objective and comprehensive charter study to date. He said that even charters had to take it seriously, because it was not aligned to any particular ideology.