Elevating Your Reporting With Data: The Dos and Don’ts
How to marry basic reporting with a data set — and tips to read it correctly
You don’t need data to write a great story. But adding data, if done well, can take your reporting from McDonald’s to Michelin-starred.
These tips will help you elevate the coverage you produce every year, such as stories on student performance; ensure you correctly translate what the data is saying; and remind you to stay skeptical.
Consider the source of any data
Just as you question the motivations of people giving you information, you must question whether data is what it seems to be. Data can be manipulated and massaged to suit a specific message; it’s our job to cut through the spin.
Particularly if data is being provided through a report, ask: Who wrote this report? Who commissioned the report? What are their credentials? Where did the funding come from? Who collected the data? How did they do the analysis? Are the sample sizes large enough to be meaningful? What limitations do the data have? Do the conclusions make sense?
Dig into the specifics
Once you find the data that can make your story sing, check these four things first:
Just because something is statistically significant doesn't mean it's meaningful in the real world. For instance, in a very large study of student test scores, a difference of a few points might be statistically significant, but that doesn't necessarily mean the difference would be noticeable or important.
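The point above can be shown with a quick back-of-the-envelope calculation. The numbers here are hypothetical: a 2-point gap in average test scores, a standard deviation of 30 points, and half a million students per group. With a sample that large, the gap is overwhelmingly "significant," yet the effect size is negligible.

```python
import math

# Hypothetical numbers: two groups of students whose average test
# scores differ by 2 points, on a scale where individual scores have
# a standard deviation of 30.
diff = 2.0      # observed difference in mean scores
sd = 30.0       # standard deviation of individual scores
n = 500_000     # students per group (a very large study)

# Two-sample z-test: standard error of the difference in means.
se = sd * math.sqrt(2 / n)
z = diff / se
p_value = math.erfc(z / math.sqrt(2))  # two-sided p-value

# Cohen's d: the same difference expressed in standard deviations.
cohens_d = diff / sd

print(f"z = {z:.1f}, p = {p_value:.2g}")  # p is vanishingly small
print(f"Cohen's d = {cohens_d:.3f}")      # about 0.07: a tiny effect
```

The p-value says the gap almost certainly isn't random noise; the effect size says a 2-point gap is still only about 7 percent of one standard deviation, which a teacher or parent would likely never notice.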
When reports indicate a percent increase or decrease, be sure to ask for the raw numbers. A 50 percent increase could represent a change from 2 to 3 or from 100 to 150.
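A quick sanity check makes the same point; the figures are the hypothetical ones from above.

```python
def pct_change(old: float, new: float) -> float:
    """Percent change from old to new."""
    return (new - old) / old * 100

# Both of these are "a 50 percent increase":
print(pct_change(2, 3))      # 50.0 -- raw change of just 1
print(pct_change(100, 150))  # 50.0 -- raw change of 50
```

Identical percentages, very different stories: the first could be noise, the second a real shift. That's why the raw numbers matter.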
When illustrating a trend over time, make sure the data was collected and analyzed in a consistent manner. If the wording of a survey question or the method of survey delivery changed, it may not be accurate to compare results from before and after that point in time.
If you see a trend in one data set, check it against another. For example, if the U.S. seems to be improving its reading scores on the NAEP test, check if the same is true for PISA test scores.
Remember to do nuts and bolts reporting
Annie Waldman, data master and education reporter at ProPublica, has some great words of caution about data: The numbers themselves are not the story; they may be a start to a story. They help reveal a potential story. Ask questions of the numbers just the same as you would of people.
For example, stories about student performance present great opportunities to start with numbers, then dig some more. If you find disparities among certain groups of children in your state, can you do a broad, day-one story explaining that those gaps exist, showing where they narrow or widen among schools, and noting other patterns you see? Absolutely. But don't stop there. Scrutinize what factors might contribute to vastly different outcomes. Are those students disproportionately low-income? If so, could that affect the resources available to them to complete work outside of school? If the same demographic of students is underperforming year after year, what is the school doing in response? How has the school tried to course-correct?
Here’s an example from some of Dawn’s work. She started this story after learning that hundreds of high-achieving Illinois students were enrolling at the University of Alabama. Illinoisans going to out-of-state colleges wasn’t new; the “brain drain” has been a problem for years. But most of those students end up elsewhere in the Midwest. The explanation was simple: money. Alabama awarded huge scholarships to draw students to Tuscaloosa. What parent could resist getting such a good deal on an undergraduate degree?
What started as a simple story about many students going to one out-of-state school instead became an in-depth look into how much the spiking cost of college in Illinois was pricing out and driving out local families.