In the weeks leading up to November’s election, as Republicans and Democrats engaged in their biennial brawl in battleground states throughout the country, the one thing most pundits on both sides of the aisle agreed on was that Nevada’s next senator would be Sharron Angle.
Why was just about everyone so sure the GOP challenger would unseat the Senate Majority Leader?
The power of polls.
Polls shape campaign strategies, focus politicians’ messages, and serve as preelection scoreboards, yet the media can struggle to use them, and the information they provide, effectively. To address that challenge, American University’s School of Communication held a workshop for journalists on June 17 and 18 that covered a range of issues essential to reporting on polls.
“It’s been said that polling is only 3 to 5 percent of a campaign budget but drives 90 percent of what goes on in a campaign,” said SOC professor Leonard Steinhorn, director of the public communication division. “There’s plenty of evidence that polling influences the news narrative, and affects some voters.”
Funded by a grant from the Robert R. McCormick Foundation, the workshop brought together some of the country’s preeminent pollsters, journalists, and faculty from SOC and AU’s School of Public Affairs, which together offer a joint master’s degree in political communication.
More than 20 major news organizations, including the New York Times, ABC News, NBC News, Politico, and Bloomberg, participated, as did experts from Gallup and the Pew Hispanic Center.
“There are a number of issues that pollsters and reporters are grappling with right now, including the rise of cell phones, online polling and, in particular, automated polls,” said Dotty Lynch, SOC executive in residence and director of the MA in political communication. “In addition, the explosion of social media has given rise to a proliferation of opinions which may or may not be representative of the public. Polls can also be a very good tool for reporters as a means of bringing the public into the equation. As we head into another presidential campaign, accurate data, not just about the horse race but about public policy, issues, and the state of the country, will illuminate the national conversation.”
Is this poll valid?
One important question the workshop aimed to explore was this: given the role polls play in so many parts of the political process, how can reporters determine whether a poll is valid or suspect?
Few are better positioned to answer that question than Mark Blumenthal, editor and publisher of Pollster.com. The site publishes poll results and a daily running commentary that “explains, demystifies, and critiques political polling for political insiders and the general public.”
He delivered the workshop’s keynote address on issues and trends in political polling heading into 2012. During the 2010 election cycle, Pollster.com logged more than 1,500 polls.
While harboring “real doubts” about the effectiveness of automated polls (those conducted by computer rather than a live interviewer), Blumenthal said they’ve proven to be nearly as accurate when it comes to predicting election results.
He sees value in evaluating and scoring the accuracy of polls and believes that more polls should disclose how they collected their data.
In New Hampshire in 2008, just one of 18 polls publicly explained what it considered to be a “likely voter.”
“That’s amazing to me,” Blumenthal said. “You don’t have to tell me what the recipe is, you just have to tell me about the ingredients. If we can get more of this stuff disclosed we can be in a position to help journalists. The more pressure we can put on, the closer we can get to knowing what polls to trust.”
In late October and early November, virtually every poll had Reid solidly trailing Angle heading into their Senate showdown. Yet when the dust cleared in the Nevada desert on election night, Reid had comfortably won reelection.
“He in our data maintained his lead,” said Mark Mellman, Reid’s pollster. “Our final margin was about five points, and that was the actual result. We were the only people that had Reid ahead. The fundamental fact is that methodology matters.”
Why was he able to call a result no one else was seeing?
“One is the definition of likely voters,” he said. “We shouldn’t really care about likely voters, we should focus on the likely electorate. You never have an electorate that is made up [entirely] of likely voters.”
Not all likely voters will vote, and not all less likely voters will stay home. Because his firm called cell phone numbers and redialed hard-to-reach respondents two, three, as many as 10 times, Mellman’s poll was more accurate, he said.
“If you only focused on people who voted in 2006 and 2008 you’re going to get the wrong answer,” he said.
Mellman’s polls, like many done by partisan firms for specific clients, weren’t released to the public.
“At the end of the day it’s hard to know from the outside what’s a good poll and what’s a bad poll,” he said. “Let uncertainty reign.”