EdMedia Commons Archive

Five Questions For … U.S. News & World Report’s Robert Morse on New ‘Best High School’ Rankings

U.S. News & World Report’s annual rankings of the nation’s “best” high schools are out, and there’s an interesting mix of traditional campuses, specialized programs, magnets and charter schools. Robert Morse, U.S. News’ director of data and research, spoke with EWA about what’s new this year, what sets the rankings apart from those produced by other news outlets, and what’s next for quantifying a school’s achievements.

1. Why did you decide for the first time to include a demographic breakdown of student populations?

We wanted to give people more information about the makeup of those schools – and what we found is proof that the top campuses aren’t just in high-income suburban districts. Of the schools receiving gold medals, 40 percent receive federal Title I funding. Among the bronze-medal schools, almost two-thirds are Title I recipients. That sort of shatters the myth that good schools are only in upper-middle-class suburbs without minority or low-income students.

2. Most magnet schools and specialized campuses like the first-place finisher – the School for the Talented and Gifted in Dallas – practice selective enrollment. So why include them in the ranks of the “regular” high schools?

Our methodology includes all public schools that meet the criteria. We’re not going to decide which groups should be excluded, or try to define what a typical public school is compared with a school created specifically to be a premier school. At least some of the public has an opportunity to attend them, and it’s worthwhile to showcase their achievements.

3. There were issues last year with schools receiving unexpectedly high rankings because inaccurate enrollment figures were used in the calculations. The data was drawn from the U.S. Department of Education’s Common Core of Data, which is supplied by individual schools and states. What steps were taken to safeguard the accuracy of this year’s rankings?

We made a mega-effort to prevent that from happening again by comparing the last three years of enrollment numbers and looking at the data far more closely. If a school’s enrollment dropped from 300 to 100, we would flag it, take a closer look, or pull it out of the rankings.

4. A number of news organizations, including Newsweek and the Washington Post, produce high school rankings. What sets U.S. News & World Report apart?

Their rankings are based on one factor – the number of (Advanced Placement and International Baccalaureate) exams taken compared to the number of eligible students. That doesn’t factor in pass rates; it’s only based on participation. Our rankings consider statewide assessments, and we ask people to understand the concept of relative performance. It’s not as simple as saying “the best school always has the highest results.” We’re saying on a relative basis a particular school is doing better than expected given the mitigating factors.

In other words, it’s actually possible to have a strategy of having kids take the AP and IB exams in massive numbers and get on the other rankings. In ours, the students actually have to learn something.

5. The rankings don’t tell us how students do once they leave a high-scoring high school – for example, what percentage of them go on to postsecondary success. Is that something U.S. News might try to tackle?

I’m not sure how we would get the data to track college students back to their high schools. The next step for us will be when there are real cohort graduation rates to factor into our analysis, and that’s coming. I’m not saying it’s not important to figure out whether students are succeeding in college; it obviously is. The question is how we would do it. That would require a big jump analytically and would take a lot of contemplation.


This post originally appeared on EWA’s now-defunct online community, EdMedia Commons. Old content from EMC will appear in the Ed Beat archives.