Blog: The Educated Reporter

Word on the Beat: NAEP
What reporters need to know about the 'nation's report card'

New achievement data in math and reading for the nation’s 4th and 8th graders was released in October 2019, showing troubling declines or stagnant scores in most areas. Alongside the national snapshot were state-by-state results, plus scores for 27 urban school systems participating in a pilot program. (Mississippi was singled out for praise as the only state to show gains in its reading scores, and for leading the country in gains in both 4th grade reading and math.) The NAEP report card reveals trends over time and offers breakdowns by race, ethnicity, poverty, and other factors. Here’s what reporters need to know about the assessment and its implications.

What it means: The National Assessment of Educational Progress, sometimes referred to as “the nation’s report card,” is given every two years to a representative sample of the nation’s students in grades 4 and 8 to gauge achievement in reading and mathematics. Twelfth graders also are tested on roughly the same schedule. Specialized subject tests, including writing, science, and civics, are administered on a rotating basis at all three grade levels, but less frequently. NAEP is a low-stakes test, meaning it doesn’t appear on a student’s transcript or affect a student’s ability to advance to the next grade or graduate.

Why it matters: NAEP is the only assessment system designed specifically to track student achievement nationally over the long term. The test questions are designed to measure what students know and are able to do, and they remain largely the same over time for greater consistency in comparisons, according to the National Assessment Governing Board (NAGB), which oversees NAEP. The data can serve as an important backdrop for educators and researchers evaluating public schools and examining trends by student demographics, including geographic location, socioeconomic status, race, special education status, and gender.

Who’s talking about it: At the federal level, NAEP scores are often leveraged as either an indictment or a validation of a presidential administration’s educational priorities, even when the scores reflect time periods under different leadership. That scrutiny ramps up during election years. After the stagnant 2017 scores were released in 2018, some pundits asked whether there had been a “lost decade” for academic progress.

To be sure, NAEP results offer only a snapshot of U.S. student achievement. The 2017 results carried an unusual amount of weight, as they were seen by some as a referendum on the Common Core, an initiative adopted by a majority of states to use a common set of standards and expectations for grade-level skills and knowledge. Scores were stagnant in both reading and math among fourth graders. Eighth graders made slight gains in reading over the 2015 cohort but no progress in math. As Lauren Camera reported for U.S. News & World Report, even more troubling was that the poorest-performing students did worse in 2017 than in 2015, widening the gap between students scoring in the bottom and top proficiency brackets. And, as Education Week’s Sarah D. Sparks pointed out, a similar widening gap showed up in U.S. students’ performance on two international tests of math and reading proficiency.

What to remember: Correlation is not causation. As Morgan Polikoff, an education professor at the University of Southern California, put it, “friends don’t let friends misuse NAEP data.” The National Assessment Governing Board will be the first to say that the test results cannot be used to gauge the impact or effectiveness of a particular educational intervention. (That, of course, won’t stop critics of specific programs and advocates of others from doing precisely that.)

That said, because the assessments aren’t linked to any local or state learning standards, NAEP can serve as a barometer against which to compare results from states’ own tests, especially in reading and math. Several studies have compared levels of student achievement on state tests and on NAEP, often finding large gaps, a red flag that a state’s expectations for student performance may be lower than NAEP’s.

Want to know more? Here’s some background from Education Week on the decision to scale back the NAEP testing schedule and drop four subject tests as a cost-saving measure — a move that could have implications for education researchers. Also, the journal Education Next offers some predictions for this year’s NAEP results.

*This post is periodically updated to reflect the most recent news, reports, and data. 

Have a question, comment, or concern for the Educated Reporter? Contact Emily Richmond. Follow her on Twitter @EWAEmily.
