Blog: The Educated Reporter

Education Research: Where to Find It and How to Evaluate It

Researchers at Michigan State University and Teachers College, Columbia University, tackled an intriguing question in a 2016 study: How much influence were large, national donors having on local school board elections?

The study’s abstract stated that large donor networks had “nationalized” local education politics in Los Angeles, Denver, New Orleans and Bridgeport, Conn.

But the real news didn’t appear until halfway through the study: National donors gave almost $4 million to candidates and committees in the four cities in the 2012 and 2013 election cycles, which made up close to half of the total campaign donations in those school board contests.

The example shows how academic research can be a gold mine for education reporters, if they know where to look.

“How can we use research to do our jobs better?” asked Jamaal Abdul-Alim, the education editor for The Conversation, and the moderator for a session on that topic at the Education Writers Association’s 2018 annual conference.

Don’t Dwell on the ‘Abstract’

Denise-Marie Ordway, the managing editor of Journalist’s Resource at Harvard University, pointed to the school board elections study to underscore her first tip: Don’t spend too much time on the abstract.

“What researchers consider to be the most important takeaway from a study isn’t always what a reporter would think is the most compelling part of a study,” she said.

Ordway was joined on the panel by Ed Pauly, the director of research and evaluation at The Wallace Foundation.

Both panelists said that it’s not easy for reporters to quickly determine the quality of a study or the quality of a journal, or to know whether the findings they’re reading are legitimate. As a result, reporters run the risk of giving all published studies equal weight in their reporting. 

Research ‘Whiplash’

That can lead to a kind of research “whiplash,” said Matt Barnum, a national education reporter for Chalkbeat who frequently breaks news from academic papers.

“If someone says a study is legitimate, I think we should validate that or invalidate that by talking to experts, or we should mention the study’s limitations,” said Barnum, who spoke with reporters about his work in a separate session.

“And then some studies are illegitimate, and we should ignore them,” Barnum added. “It’s our job to figure out where a study falls on that spectrum.”

So, where to start?

Ordway said that first, reporters must be mindful of the difference between correlation and causation. You can’t infer one occurrence caused another just because the two happened at the same time, she said.

For example, if test scores increased after a school decreased class sizes, or after it implemented a new instructional program, that doesn’t mean that the particular intervention caused the achievement boost.

Ordway said reporters should not assume a study is bulletproof because it’s from Harvard or another elite institution. Instead, she said, they should look for specific indicators of quality:

  • Is the study published in a peer-reviewed journal?
  • Can the results be replicated?
  • Did it present all the data and statistical analyses?
  • Were there conflicts of interest?
  • Does it make inferences that feel too broad based on the sample size?
  • Does it explain the findings while also noting limitations?

Ordway said it’s also important to ask who is affected by the findings. For example, does a research finding apply to college students in one state? Or college students nationwide?

Pauly said assessing the credibility of academic research is a lot like assessing the credibility of sources a journalist interviews. He suggested reporters mine literature reviews, or research articles written by experts who curate high-quality papers on a topic. Pauly recommended reporters search for those in the Review of Educational Research.

A Springboard for Reporting

“What’s open and unresolved, and what’s no longer in dispute?” Pauly asked. “That can be a springboard for your reporting. And then you can also tell your readers what claims are really solid.”

Pauly also suggested that education reporters cultivate sources at academic institutions who are willing to assess whether a study is important and reliable. Ordway echoed that point.

“Have good education faculty as sources,” she said. “Then you can go to them and say, ‘What do you think of this?’, and get some feedback. That’s invaluable.”

Pauly said one way to assess the quality of an academic journal is to see where it ranks in Journal Citation Reports, which evaluates and compares thousands of journals based on how frequently they are cited by others in the field. A journal’s “impact factor” from JCR is generally listed in its Wikipedia entry, Pauly said.

The journal-ranking process helps decide which professors get tenure; those who publish frequently in higher-ranked journals are generally looked upon more favorably. But as a byproduct of that process, Pauly said, journalists can often make at least “limited decisions” about a journal’s quality.

As for how to keep up with the latest research, the panelists recommended several steps:

  • Sign up for alerts from leading education research journals. Some of the larger ones have press departments that give out codes to access the journal for free.
  • Find out how to get alerts about new papers or research from your local university and/or research institutions.
  • Sign up for the weekly email from Journalist’s Resource at Harvard University.


Have a question, comment or concern for the Educated Reporter? Contact Emily Richmond. Follow her on Twitter @EWAEmily.

Read other Educated Reporter articles.