Blog: The Educated Reporter

Assessing the New Standards: Are Schools and States Ready?

Jeffrey Solochek of the Tampa Bay Times (far left) moderates a conversation with UNC Prof. Greg Cizek (center) and Scott Norton of the Council of Chief State School Officers. (EWA)

This spring, schools in most states are preparing for a critical juncture with the Common Core State Standards: Their students will take state tests pegged to the standards for the first time.

At a recent Education Writers Association seminar, a pair of assessment experts offered analysis of the shifts underway in states and key issues reporters should keep in mind as they cover student testing, including the shrinking number of states committed to shared testing, the unique challenges of online testing, and the potential risks to test security.

A lot has changed since the standards for English language arts and math were finalized and adopted by most states in 2010. Some states have either repealed or are formally reviewing the standards, while others have abandoned prior plans to participate in common assessments. There are two state coalitions that have developed common assessments, fueled by $360 million in federal aid: the Smarter Balanced Assessment Consortium and the Partnership for Assessment of Readiness for College and Careers (PARCC).

At the EWA seminar, which took place Jan. 12 at the University of North Carolina at Chapel Hill, the assessment experts discussed the many changes states are going through with regard to the standards and tests.

Scott Norton, the director of standards, assessment, and accountability for the Council of Chief State School Officers, highlighted the shrinking number of states committed to giving shared assessments from either the PARCC or Smarter Balanced consortia. He noted that during the “high point” in 2010, 45 states plus the District of Columbia were planning to use the consortia-developed tests. Now, the figure is down to 27 states.

Gregory Cizek, a UNC professor of educational measurement and evaluation, said that regardless of which test is used, it is important to note the difference between what students are actually learning and what the standards say they should learn:

“It matters more what’s taught than what’s supposed to be taught,” he told the EWA audience.

Cizek also pointed out that states using an alternative to the Common Core may have equally strong standards.

 “[The] Common Core State Standards are good, but not sacrosanct,” he said. “It’s important not just to call something ‘Common Core’ or a set of rigorous standards.”

There can also be a difference between how people think about Common Core assessments and what those exams actually look like, Cizek said. In addition, he said he has noticed a growing concern from the public that students are spending too much time taking standardized tests.

“People are realizing it’s going to take a lot of time and money to create a rigorous assessment,” Cizek said.

Since the PARCC and Smarter Balanced assessments will be administered this spring, many schools participated in “field tests” in 2014 to give students practice taking the exams and to identify testing problems so they could be resolved before the real assessment.

Although Smarter Balanced and PARCC both offer pen-and-paper options, the exams are designed to be taken online. Norton said schools administering these computer assessments may run into some issues, especially in schools and districts with limited access to computers or broadband.

“There’s going to be some bumps along the way,” he said. “There are going to be growing pains on the system and infrastructure side.”

Cizek warned of potential issues with test security, since the assessments can be administered over a wide window of time. Students who take the exam later in the testing window may have an advantage because they might overhear peers talking about the exam’s content, potentially jeopardizing the reliability of results.

Another problem that comes with the first year of a new assessment is comparability. Norton said this year’s test scores will be difficult to compare to previous years. He said he also expects many students to receive low scores since they will be unfamiliar with the format and this will be the first time the assessments are administered across the country.

“It’s going to be pretty difficult to compare new tests to old and make a reasonable conclusion,” Norton said.

Once the test results are available, every state can define a “proficient” score differently, Cizek said. The “cut scores” will be set by the consortia, but it’s up to individual states to determine the stakes that are attached — for schools and students — to those results.

Cizek said this is another reason first-year results will be tough to compare to the past, because high-performing states like Massachusetts will likely have a different definition than states such as New Mexico or Mississippi, which typically post some of the lowest test scores in the country.

Both Norton and Cizek offered several ideas for reporters to pursue in their respective states when it comes to assessments this spring:

  • Ask your state what its plan is for explaining the new assessments to the public, particularly to parents of students who previously were deemed “proficient” but now fall below that category.
  • Will there be technical dress rehearsals to minimize difficulties during the actual testing cycle? How are staff being trained and prepared?
  • Ask districts what they are forecasting for the new assessment: How will scores be different from prior years?
  • Are students ready for an online standardized test? Ask them about their experience. Do they feel prepared?
  • Once the tests are taken and scores are released, how does the percentage of students who scored “proficient” compare between states?