Teacher Evaluations: Education Reporting That Measures Up
How teachers are evaluated is one of the most rapidly changing areas of education policy, said Mackenzie Ryan, a Florida Today education reporter who moderated a panel on the topic at EWA’s National Seminar in Nashville.
With that as the backdrop, Lisa Gartner, a Tampa Bay Times reporter, and Patrick O’Donnell from the Cleveland Plain Dealer shared how they covered the topic in their home states.
In Florida, the state released teachers’ VAM scores — ratings derived from a “value-added model” intended to measure teachers’ effectiveness. The formula used to calculate the scores was so complicated, Gartner said, that one superintendent she interviewed referred to it by a four-letter word.
With most of the Florida papers covering the story on deadline, Gartner said she set out to write a story about a complex topic that people could connect with and would want to read. She didn’t want her story to be too technical or full of jargon.
“If you don’t understand something, you don’t want to talk about it,” she said.
She also wanted to give frustrated teachers a voice and write a story with the human element.
Her story focused on the district’s “Teacher of the Year” finalists and past winners. These were educators who had been praised for their work in the classroom and received special recognition — including limo rides and campus celebrations — in what has become an annual tradition in most Florida public school districts.
One challenge, Gartner said, was talking about a very personal topic with her sources.
She emailed and left voice messages for the teachers, asking if they would weigh in for her story because their opinions were important and would help represent other teachers.
In the end, Gartner got what she needed for the story. (For more from Gartner on the project, check out the EWA Radio podcast.)
So how did the teachers of the year, the supposedly best of the best, do on the VAM? Under the headline “Confused by Florida’s teacher scoring? So are top teachers,” Gartner reported that “seven out of 10 Hernando County Teacher of the Year finalists whose VAM scores were available had negative scores.”
Several teachers commented for her story and questioned the accuracy of the VAM scores.
As one put it, “Whether you smile at a child or are stern with them … whether you can convince them to try a new book — how do you evaluate that?”
Covering a different angle on teacher evaluations was O’Donnell, who worked with StateImpact Ohio to analyze VAM data as the state began rolling out scores from the 2012-13 school year. The collaborative series was called “Grading the Teachers.”
O’Donnell’s goal was to analyze the trends. What effect did teachers’ age, pay or experience have on their VAM scores? How did a school’s demographics, or whether a teacher held a master’s degree, factor into the scores? He also looked at questions such as: Do teachers’ VAM scores improve over time?
“If you assume and accept VAM stands for something, you can check it against other things,” O’Donnell said.
One of their findings was that there was little connection between Ohio teachers’ salaries and “how much knowledge they impart to students over the course of a single year,” O’Donnell said.
“That’s true in Cleveland, where teachers deemed ‘Least Effective’ by the new state evaluation system earned, on average, about $3,000 more than the teachers deemed ‘Most Effective,’” O’Donnell and Molly Bloom wrote in a story that was published in June 2013.
For O’Donnell, it was difficult to include comment from teachers with the lowest rankings. StateImpact Ohio contacted 100 teachers, but only two responded. One valuable source ended up being a teacher who was so devastated by her low ranking that she was leaving the profession and wanted to share her story.
There were other challenges, too. The state released results for 4,200 teachers who had two years’ worth of data. Some critics contend that reporters should wait for three years of data to get a more accurate portrayal of a teacher’s performance.
In the end, the series included a database where readers could search for teachers by name.
“We needed to say, ‘This is how the state is rating your teacher,’” O’Donnell said. “If it’s good enough to count, it’s good enough to print. Why are we not telling people?”
On the decision to name teachers, O’Donnell said media outlets shouldn’t do it without careful consideration: “If you publish, know why.”