Blog: The Educated Reporter

Reporter’s Guide to Research: Getting Smart About Education Studies

Academic research can serve up some of the most original and meaningful stories journalists could hope to cover, if we know where to look. But Holly Yettick, a reporter-turned-researcher at the University of Colorado-Denver, says hardly anyone in the news business today is writing about the latest research on schools. In one of the conference’s first sessions, Yettick shared her tips for finding good studies to write about and writing about them without overselling the results.

Yettick, who wrote her dissertation on what education research gets covered in the press and why, said that while 45 percent of Education Week’s stories are about academic research, less than 1 percent of daily newspapers’ education stories are. (That chart and others are included in her PowerPoint, which she’s posted online.) Newspaper reporters are not just missing out on good stories, she said, but on opportunities to add big-picture context to coverage of our schools, so the news doesn’t seem like an “endless stream of disconnected events.”

Sometimes a study’s findings are worth covering on their own. In other cases, the findings may provide good perspective on some other news. There could even be good research under way that focuses on a school district you cover; Yettick recommended filing an open records request for research proposals submitted to your local district or state education department, or browsing the district’s requests for proposals (RFPs) to see if they’ve commissioned any research from outside. Scanning local university professors’ websites might turn up more good leads on studies focused nearby.

But Yettick said the best research to write about is often the work published in peer-reviewed journals. “Vetting research is really hard,” she acknowledged, so it’s a whole lot easier (and more reliable) to stick with studies that have withstood the intense scrutiny of peer review. Yettick described the peer review process in terms reporters can relate to: “Imagine your worst day ever with your meanest editor,” she said, “and then imagine twice as bad as that.”

In another handout, also posted online, Yettick lists some good sources for new peer-reviewed research on education, including the American Educational Research Association and the Association for Education Finance and Policy. She also recommended searching Google Scholar and scanning the programs of education research conferences.

Some studies will be quantitative, heavy on charts and data, while others will be qualitative (“kind of like journalism on steroids,” Yettick said). To get a sense of whether a study is worth covering, she suggested scanning the article’s abstract, then its discussion at the end, and the charts or quotes included in between.

Yettick also recommended using academic research to learn quickly about a new subject you’re writing about. “Lit reviews” round up the latest studies on a subject and can add context that’ll tell you whether some other study you’re covering is an outlier. “Any one study, even if it’s great, can be wrong,” she said. Yettick recommended John Hattie’s enormous 2008 volume Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement as a starting point for understanding the research on schools and learning.

Once you’ve found good research worth covering, the next challenge is writing about it well, without overselling or misinterpreting the results. Press releases announcing new findings often come from P.R. folks with an interest in making a study sound extra-juicy. Yettick said it’s important to look critically at what the researchers had to say and never to rely on a summary in a press release.

Yettick suggested that a press release itself should be a red flag prompting reporters to look critically at the results: if you get a press release about a study, some advocacy-oriented group has probably decided the results are worth spending money to promote. She listed a few more common warning signs: promising a “silver bullet” for a big problem, using letter grades to simplify complex findings, and suggesting a connection between two very different things. Sometimes, Yettick said, statistics-oriented researchers draw conclusions from data that they wouldn’t draw if they better understood how schools work.

But Yettick said none of those are sure signs of shoddy research, just reasons to be careful. Reporters shouldn’t be too quick to write off research on a suspicion of bias. “Don’t assume someone is just a shill for a group,” Yettick said, just because they took outside funding. Nor should reporters assume a researcher is biased because they seem obsessed with a particular subject: Most researchers, Yettick said, spend their careers answering little pieces of the same big question.



Have a question, comment or concern for the Educated Reporter? Contact Emily Richmond. Follow her on Twitter @EWAEmily.