
These are 2 reasons why Americans have grown skeptical of polling

Justin Vaughn, director of the Center for Idaho History and Politics at Boise State University, explains the results of BSU's latest public policy survey on Jan. 18, 2018. kjones@idahostatesman.com

Boise State last week unveiled its third annual Idaho Public Policy Survey on Idahoans’ attitudes concerning several key policy issues, including taxes, health care, education and workforce development.

Among its findings: Nearly 4 in 5 of us believe our state’s elected officials need to create an Idaho solution regarding affordable health care plans. Almost 3 out of 5 want the grocery tax eliminated. And a similar percentage think our economic future is in encouraging new tech and entrepreneurial industries, not in Idaho’s historical base of agriculture, timber and mining.

The team behind the survey stands firmly by its results.

“We have no political agenda in pursuing these questions,” said Corey Cook, dean of the Boise State School of Public Service. “What we are trying to do is rigorous academic research that tells us what the public’s priorities are, what their preferences are on a range of policy issues.”

Cook has more than a decade of experience working on polls and surveys. He and the BSU team went to great lengths to ensure the 1,000 Idahoans surveyed mirrored this state’s makeup in age, geography, gender and the kinds of phones we own.
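Matching a sample to a state's makeup often comes down to simple arithmetic. The sketch below illustrates generic post-stratification weighting with made-up numbers, not BSU's actual procedure: each group of respondents gets a weight equal to its share of the population divided by its share of the sample.

```python
# Illustrative sketch of post-stratification weighting (hypothetical
# numbers, not BSU's method): up-weight under-represented groups so
# each counts in proportion to its share of the population.

def poststratification_weights(sample_counts, population_shares):
    """Return a weight per group: population share / sample share."""
    total = sum(sample_counts.values())
    return {
        group: population_shares[group] / (count / total)
        for group, count in sample_counts.items()
    }

# Hypothetical 1,000-person sample that under-represents
# cellphone-only households relative to a 60 percent population rate.
sample = {"cellphone_only": 400, "landline": 600}
population = {"cellphone_only": 0.60, "landline": 0.40}

weights = poststratification_weights(sample, population)
# Cellphone-only respondents are up-weighted, landline down-weighted.
```

Real surveys weight on several variables at once (often by iterative "raking"), but the principle is the same: scarce groups in the sample count for more.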

But that’s not always the case. Polls across the U.S. are often met with skepticism, a perception of inaccuracy that has only deepened since the 2016 election.

Polls’ increasing unreliability in the United States can be attributed to two trends: a rise in the number of people using cellphones, and a decline in the number of people willing to participate in surveys.

“(W)ith the rise of cellphones and the increasing prevalence of people who decline to answer or complete surveys, how do you ensure that those you are interviewing are in fact representative of the underlying population?” Cook said. “This is not an intractable problem, but it places a premium on rigorous methodology, which can be complicated and costly.”

The problem with phones

The rise of cellphone use itself does not lead to inaccurate polls. The problems occur when pollsters do not include cellphones in their surveys because of cost, logistics and federal regulations banning automated calls.

A 2015 National Center for Health Statistics study showed 61.6 percent of Idaho homes did not have a landline. That’s higher than the national rate (as of 2016) of 50.8 percent. And, it means landline-only polls miss about three-fifths of Idaho households.

“We did a 59 percent cellphone sample, which is a lot,” said Justin Vaughn, associate professor of political science and director of the BSU survey research team. “If you look at the information on most surveys, they are nowhere near that.

“… People who live in cellphone-only households are different than people who live in households with landlines. If you only call landlines, you will get an inaccurate picture of what people feel like.”

The well-known Pew Research Center now conducts 75 percent of its interviews via cellphones.

“We’ve actually found that it has improved the representativeness of our surveys by improving our ability to reach lower-income, younger and city-dwelling people — all of whom are more likely to be mobile-only,” Michael Dimock, Pew Research Center president, said earlier this year in a Q&A posted on Pew’s website.

But polling cellphone users is itself a challenge because federal law bans robocalls targeting cellphones. The only way to poll cellphones is with a live person dialing, which is costlier and more time-consuming.

Live polling does allow for interaction between the pollster and respondent. Robocalls use automated scripts with no flexibility.

“(R)obocalls tend to do the least well among the various types of surveys,” said Cook.

“Ultimately, the key to survey research is that everyone in your target population has an equal chance to be contacted. Many of the surveys that are used today simply don’t have a high enough quota of cellphone-only households and as a result might skew the sample.”

The problem with participants

Carefully identifying whom you want to poll is not enough if they do not want to be polled.

The share of people who respond to randomized telephone surveys has fallen from 36 percent in 1997 to 9 percent in 2016, according to the Pew Research Center.

“A low response rate does signal that poll consumers should be aware of the potential for ‘nonresponse bias’ — that is, the possibility that those who didn’t respond may be significantly different from those who did participate in the survey,” said Dimock.

Cook said lower response rates are an international phenomenon that goes beyond just political surveys.

“We don’t really know the definitive answer, but there are lots of ideas out there — concerns about time and privacy, lack of trust in surveys, increased blocking technologies,” he said. “But this appears to have at least stabilized over the last five years or so.”

While poll participation is down, Pew found that “even at low response rates, telephone surveys that include interviews via landlines and cellphones, and that are adjusted to match the demographic profile of the U.S., can produce accurate estimates for political attitudes.”

A polling challenge for elections (like the busy ones ahead this year in Idaho) is identifying people who are going to vote. There’s no use asking nonvoters what they are going to do.

Typically, pollsters who only want to query likely voters will start with state or county voter registration files to gather names of people who have voted in two or more of the last four general elections, said Vaughn. Calls off that list then usually begin with a screening question to confirm each person is likely to vote again.
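The screening step Vaughn describes amounts to a simple filter over the voter file. The sketch below uses made-up names and a hypothetical data layout to illustrate the two-of-the-last-four rule; it is not any pollster's actual code.

```python
# Illustrative sketch (hypothetical data layout): build a call list
# from a voter file, keeping people who voted in at least two of the
# last four general elections.

def likely_voters(voter_file, min_votes=2):
    """Each record has a 'name' and a 'voted' history of True/False
    flags for the last four general elections."""
    return [v["name"] for v in voter_file
            if sum(v["voted"]) >= min_votes]

voters = [
    {"name": "A. Adams", "voted": [True, True, False, True]},    # 3 of 4
    {"name": "B. Boone", "voted": [False, False, True, False]},  # 1 of 4
    {"name": "C. Clark", "voted": [True, False, True, False]},   # 2 of 4
]

call_list = likely_voters(voters)  # ["A. Adams", "C. Clark"]
```

Calls off that list would then begin with the screening question described above, since past turnout alone does not guarantee a future vote.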

But just because someone tells a pollster they will vote does not mean they will follow through come Election Day.

“It is well understood that many people who are eligible to vote and who tell pollsters they intend to cast a ballot will not actually do so. Similarly, some people who express little interest in the election or uncertainty about voting will nevertheless turn out,” reports Pew. “Consequently, identifying who is likely to vote is fundamental to making accurate forecasts from pre-election polls and correctly characterizing the views of the electorate.”

Cook agrees that determining likely voters is a challenge.

“Much of the variation we see in public polling comes down to differing projections of a likely electorate,” he said. “It is challenging enough accurately measuring opinion. Introducing this second element, gauging the likely composition of the electorate, introduces another source of possible error.”

Some polls got 2016 right

Pollsters and pundits have been cognizant of the landline/cellphone and participation issues for several years.

But the experts are still parsing 2016’s failure to accurately capture support for President Donald Trump.

In its examination of 2016 polling, the American Association for Public Opinion Research (AAPOR) found national polls were on point, but only for the popular vote. “National polls were among the most accurate in estimating the popular vote since 1936. Collectively, they indicated that Clinton had about a 3 percentage point lead, and they were basically correct; she ultimately won the popular vote by 2 points,” states the report released last May.

But some state polls, which are more relevant for Electoral College projections, missed their mark and underestimated Trump’s support.

AAPOR identified several reasons why that was the case.

Some voters decided on their presidential choice in the final week, which would not have been reflected in polls. Others did not reveal to pollsters their true voting intentions. And many polls did not adjust for an overrepresentation of college graduates. Recent studies have found people with more education are likelier to participate in polls than those with less education.

Now, pollsters have to deal with another trend. In 2016, for the first time, millennial and Generation X voters (ages 18 to 51) outnumbered baby boomers and the Greatest Generation (ages 52 and older), according to a Pew Research analysis.

Millennial and Generation X voters comprised 51 percent of voters in the 2016 general election, according to the study.

Idaho did not hit that mark in 2016, but it got close. Millennial and Generation X voters comprised 46 percent of the state’s turnout in the 2016 general election.

In Ada County, home to one-quarter of registered Idaho voters, 53 percent were under age 52.

“Generally, millennials are less attached to political party affiliations than their older peers, which has in presidential elections, and will in the future, reshape the electorate,” Cook said.

But that could change with each passing year as younger generations mature, as the 2016 turnout numbers suggest.

This is why it is important for pollsters to tap into the younger generations if they want to accurately reflect what citizens are thinking, Cook said.

“Ultimately,” he said, “millennials will have the impact in the electorate we are seeing in the workplace and a segment of the consumer market.”

Cynthia Sewell: 208-377-6428, @CynthiaSewell
