Legacy pollsters, especially those in the media, are refusing to improve their methods, undermining the reputation of the industry, according to Richard Baris, director of Big Data Poll (BDP).
Pollsters are using limited survey methods and are asking too few people for their surveys to produce accurate results, he said in an interview.
“They just refuse to evolve.”
He’s been trying to change that.
With some five years in the business, his company produced some of the most accurate results during the 2016 and 2018 election cycles. And for 2020, he’d like to produce the “perfect poll.”
“Pollsters are failing to realize that everybody responds to the surveys differently,” Baris said.
It used to be the practice to simply pick a representative sample of Americans and have people call them on landlines.
“As time went on and as more people transitioned to cell phones, pollsters, especially media pollsters, they really chose to focus on that,” Baris said.
It’s now common, for instance, to have half the respondents gathered by calling landlines and half by calling cellphones. Some pollsters have focused on online polling instead, earning some criticism from the old guard.
The thing is, no one collection method is right—or wrong for that matter, according to Baris.
“Everybody responds to these different modes of collection at different rates,” he said.
Older and more conservative voters, for instance, respond at higher rates to landline calls, but often require anonymity.
Live Versus Automated
Moreover, people may be reluctant to answer honestly when talking to an actual person.
“There’s a lot of people who don’t want to tell you the truth,” Baris said. “They’re told what they should feel about an issue and they feel like … that live caller may be judging them.”
Trump approval among older, white Democrats, for instance, tends to be in “mid-double digits” when gathered online, but drops to “mid- to high-single digits” when gathered over the phone, he said.
Automated calls tend to be more successful at getting people to speak freely.
That doesn’t mean, however, that live-caller surveys aren’t valuable.
“You will be able to reach more metro, gentrified voters on that mode of collection, at a higher rate than you’ll be able to reach conservatives,” Baris said.
Online polls offer anonymity, but their participants are usually recruited into so-called panels and “the panels are generally getting more liberal as time goes on,” Baris said. “I think a lot of liberal voters have figured that out, so they opt in to participate in these surveys a lot more now and they join panels.”
BDP pulls its online panels from voter rolls. People still need to opt in, but the company maintains control over who is invited in the first place.
The key is the right mix of collection methods, so no group of voters is left out.
The company also weights results not just by age, gender, race, and income, but also by region and education, which, Baris said, other pollsters are getting on board with, too.
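Demographic weighting of this kind typically works by comparing each group’s share of the sample to its known share of the electorate, then scaling each respondent accordingly. A minimal sketch with made-up numbers (illustrative only, not BDP’s actual procedure or data):

```python
from collections import Counter

# Hypothetical sample: 60% college graduates, 40% non-college voters,
# versus an assumed electorate that is 35% college / 65% non-college.
respondents = ["college"] * 60 + ["non_college"] * 40
population_share = {"college": 0.35, "non_college": 0.65}

sample_share = {k: v / len(respondents) for k, v in Counter(respondents).items()}

# Each respondent's weight = population share / sample share for their group.
weights = {k: population_share[k] / sample_share[k] for k in population_share}

print(round(weights["college"], 2))      # over-represented group is down-weighted
print(round(weights["non_college"], 2))  # under-represented group is up-weighted
```

With these numbers, college graduates get a weight of about 0.58 and non-college voters about 1.63, so the weighted sample matches the assumed electorate. Real pollsters weight on several dimensions at once, usually with iterative methods such as raking.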
Baris argues that pollsters aren’t asking enough people to come up with results that are precise and granular enough.
“There is no way you can get an accurate poll with 400 respondents in a state like Florida, where there are 4 million to 9 million voters, depending on the cycle,” he said.
Even within a subgroup, such as Hispanics, there’s a lot of diversity in Florida, where BDP is based.
“We always oversample for Florida polling, we always oversample Hispanics, we always ask Latinos detailed questions and we’ll know whether or not they’re not just simply Hispanic, but whether they’re Puerto Rican, or Cuban, or Mexican and what region of the state they’re coming from,” Baris said.
“A lot of people are trying to cut corners and save costs so they’re letting that slide through the cracks.”
Baris boasts a respectable record in predicting election results.
In 2016, BDP polls were generally more accurate than the average of other polls in the 10 crucial battleground states.
While the polling average by Real Clear Politics projected Donald Trump leading in Florida by a mere 0.2 percent, leaving the nation in suspense, BDP had Trump ahead by a more comfortable 1.6 percent. He won by 1.2 percent.
The difference was more marked in Michigan, where pollsters called for a Hillary Clinton victory by an average margin of 3.4 percent. BDP had Trump up by 1 percent. He won by 0.3 percent.
Perhaps BDP’s greatest achievement, though, was Pennsylvania. The average of polls had Clinton up by 1.9 percent, but BDP had Trump ahead by 0.6 percent; he won by 0.7 percent.
BDP’s only wrong call was on New Hampshire, where it had Trump winning by 0.6 percent; Clinton ended up winning the state by 0.3 percent.
Baris lamented the disservice to the craft caused by low-quality, inaccurate polling.
“Ultimately, this is supposed to be a public service and nobody trusts us anymore,” he said. “If you’re perceived to be a right-wing pollster, no Democrat is going to believe the result that comes out. If you’re perceived to be a left-wing pollster, no conservative or Republican is going to believe it. That’s not the way this should be.”
He wants to try to gain public trust through transparency, showing people in minute detail how the results were collected, when, where, and from what kind of people, how they were weighted, and why. “So much of the system is left out,” he said.
That, perhaps, may convince people to seek an alternative to the legacy pollsters.
“You’re never going to beat them in a narrative war. They just have too much money and they’re just way ahead of the game on that and have more air time,” he said.
Baris stopped short of accusing pollsters of widespread fraud. Instead, he thinks they just can’t be bothered to step up their game because “if there are misfires, they generally tend to be in one direction” and that direction happens to be the one most pollsters lean toward.
“You have a status quo, there’s a lot of people making money off of that status quo, and to change it requires some people to move over and they don’t want to move over. They like their gig,” he said.