By Lisa Brody

Truth in numbers: The reliability of political polling


There's a saying that numbers never lie. But numbers can be deciphered, manipulated, interpreted and misinterpreted, all in an effort to tell one story or another. Just ask a pollster or campaign manager for anyone running for public office, especially right now, as we head into the final days of the 2018 midterm general election.

In the abstract, a pollster is someone who conducts and/or analyzes opinion polls, on any matter or topic. A strong pollster provides a snapshot in time – the feelings and state of mind of those who answered the poll. A poll should not predict but reflect, distilling and putting in perspective, as one tool and one measurement, the public's thoughts at the time it was taken.

“It is a snapshot in time – it could change tomorrow,” noted Tim Malloy, assistant director of the Quinnipiac University Poll in Connecticut. “The purpose of polling is a bit like reporting, but of the public. It's taking the temperature of the public's feelings, of issues and politicians, and then relating it to government.

“It's not our intent to shape things,” Malloy continued. “Our hope is that politicians and leaders see them (the polls) and understand them, and we hope the general public sees them. It's telling the broader public what we learned from calling other Americans.”

Ed Sarpolus, founder and executive director of Target-Insyght, which has done polling for The Detroit News, Detroit Free Press, Los Angeles Times, Washington Post, and WDIV-TV, is emphatic that the role of a pollster is not just to report what they've polled.

“The nature of being a pollster, your job is to interpret the research, data and statistics,” Sarpolus said. “Sometimes it is looking at the margins of error, and sometimes it's dismissing the margins of error. It's looking at the patterns and trends, because those can be telling me something. I have to study history, other sources, and previous voting patterns (in a specific race). I'm always less interested in what the politically correct response is than in what actually is...what the data tells me.”

Polling is now a big business, with many Americans viewing polls with skepticism and distrust – or interpreting them through partisan filters and self-interest. Part of that distrust has been fostered by President Trump, who has repeatedly told his followers that the polls from the 2016 election were wrong – when pollsters say they generally were accurate, with a majority of their polls showing only that Democrat Hillary Clinton was leading the 2016 presidential contest, and in the final days by just two or three percentage points, which is roughly the margin by which she won the national popular vote.

The Brookings Institution noted, “If you took a public opinion poll about polls, odds are that a majority would offer some rather unfavorable views of pollsters and the uses to which their work is put. Many potential respondents might simply slam down their telephones. Yet if you asked whether politicians, business leaders, and journalists should pay attention to the people's voices, almost everyone would say yes.”

The Brookings Institution asks the same question we all do – “Can we trust the polls? Under the best of circumstances, the answer is 'Not necessarily without a fair amount of detailed information about how they were conducted.'”

The Pew Research Center concurs, noting that the accuracy of a poll depends on how it was conducted.

“It all depends on the poll, and the crafting of the poll,” explained Jen Eyer, senior vice president of Vanguard Public Affairs, which provides public relations, marketing and consulting services for candidates, but not polling. This cycle, Eyer is working with Andy Levin, the Democratic candidate in the 9th District, which covers Bloomfield Township, Beverly Hills, Bingham Farms, Franklin, Royal Oak, Huntington Woods, and part of Macomb County.

She said campaigns typically choose their own pollsters. “Polls have to be careful not to skew the results. A carefully crafted, worded poll can help predict the results of a race or ballot initiative, but it's important not to rely just on one poll. Further, it's important (for the candidates and the strategists) to know how the messages are being received by voters. But polls are a guidepost – every candidate I've ever worked with has certain core beliefs they are going to stick with. Polls allow them to know which issues are most important to voters, and what voters want to hear at that time – so you can tell them what you're going to do about certain issues they want to know about right now.”

She said a good example might be learning that tax cuts are not top of mind for the public, even if they are a core belief of the candidate – while health care concerns are very important to voters. A candidate might pivot to speaking about health care after learning that from polling, in order to appeal to voters' concerns. Once in office, if the candidate prevails, he or she might return to working on tax cuts, along with the health care issue.

“Good pollsters, especially this cycle, are better – they are taking deeper dives. They are not just asking questions about candidates, but about voters' feelings. There are pollsters who are engaged in the process,” noted Dennis Darnoi, a political consultant with Densar Consulting. “In 2016, voters said, 'Oh yeah, I'm going to vote,' and were counted as 'likely voters,' and many pollsters did no follow up. They just took it at face value. They just did not do the deeper dive. This cycle, it's a different electorate and a different time.

“Before, you could just look at someone's voting history – you could just look at that data and not press any harder,” Darnoi said.

“What I'm seeing in '18 from trusted pollsters, is they're really gaining an understanding of the motivations of the voters,” he said. “It's really cheap to do a poll – and it doesn't have to be good or bad.

“The good pollsters, who do live polls, which are very labor intensive and expensive – those are very accurate,” Darnoi said. “But they get lumped in with the cheap pollsters. Unless there is a clear distinction between cheap pollsters and good pollsters, it's going to be said that polling is obsolete.”

Then there was the 2016 election, when polls seemingly led pundits, media and voters to believe that former Secretary of State Hillary Clinton, the Democrat, held a commanding lead in the presidential race, only to have Republican Donald Trump squeak by and win the presidency.

“What occurred in the 2016 general election is that every reliable poll was done at least 10 days or more in advance,” said Richard Czuba, founder of Glengariff Group. Czuba said he is currently doing the polling for The Detroit News and WDIV-TV. “Polling is a snapshot of a particular moment, when a poll comes out. Yet races close in the final five days, the weekend before an election. I don't know of any media that can afford to do polls that close – and I wouldn't trust any that is spending money to. It would be influencing the race.”

He said that in 2016, his last public opinion polls were done on October 10 and 11.

“That's four weeks away from the election. There was no way it was meant to predict a race,” he said. “Instead, it was meant to help inform people of which way people were thinking.”

The point of polls, Czuba emphasized, is not to predict, but to serve as a barometer.

“We need to be more responsible with public polls, because it (has the potential to) change the narrative,” he said. “It sears itself into the public mindset, and tells the voter population what is happening.

“Polls should be much more than predicting a horse race.”

This year, for the 2018 November 6 election, Czuba said he and many other pollsters are looking at “how people are viewing races through the lens of how they personally view the president. Rather than just putting out the numbers, we're putting out what is motivating voters.”

To accomplish that, Czuba, Sarpolus and other experienced pollsters are working in different ways. Few are relying on automated “robocalls,” which by federal law can only be made to landlines. In an effort to address the ever-growing scourge of telemarketing calls, Congress in 1991 enacted the Telephone Consumer Protection Act (TCPA), administered by the Federal Communications Commission (FCC). The TCPA restricts telemarketing calls and the use of automatic telephone dialing systems and artificial or prerecorded voice messages. The law was updated in 2012 and forces telemarketers to respect Do Not Call lists for landlines; political campaigns and polling are exempt. Telemarketing calls to wireless, or cell, phones remain illegal, however, and it is illegal for anyone making any call, other than for emergency purposes, to use an automatic telephone dialing system or an artificial or prerecorded voice message to reach a cell phone. That doesn't mean all campaigns follow the law.

“The biggest change in polling in the last several years is people getting rid of their landlines – so the trick is getting them on their cell phones. Anyone looking to save a dime is getting rid of their landline,” said Dave Doyle, executive vice president of Marketing Research Group (MRG). “When there were landlines, you were pretty sure you were getting a certain geographical area – but now with a 248 area code, you could be in Los Angeles, New York, or Mozambique. The key is figuring out the right mix of cell phones, and if you have a large enough area with various demographics.”

Doyle has worked with state Rep. Mike McCready (R-Birmingham) in his three previous races for the House 40th District, as well as in his current state Senate battle for the 12th District (Bloomfield Township, Beverly Hills, Bingham Farms, Franklin, Auburn Hills, Pontiac, Clarkston, Independence Township, Keego Harbor, Sylvan Lake, Oakland Township, Addison Township, Orion Township, Oxford, and Southfield Township). “In an area like Bloomfield Township, you want to make sure you have a good mix of the community,” he said. “It's harder to do with cell phones. You have to ask more questions to make sure you have the right geographic mix. You have to have good live callers, because the best polls are with live callers.”

Doyle said that some firms get around the law restricting “robocalls,” or automated surveys, to landlines by having a live dialer place the call to a cell phone and then connect it to a recording.

“We do not do that,” he emphasized.

“Anyone who is only calling landlines, and not calling cell phones – you're obsolete and irrelevant,” Malloy, of Quinnipiac, said.

Czuba concurred. “Michigan has a lot of schlock pollsters – driven by doing automated polls,” he said, “where they never call cell phones and just push a button.”

Glengariff only uses live callers “talking to real people,” Czuba said.

“Automated polls are known for underrepresenting young people, who primarily have cell phones rather than landlines,” Jill Alper, of Alper Strategies, told Michigan Information Research Service (MIRS) in early October.

“Polls and surveys have to fold in cell phones as people have cut the cord on landlines,” said Arnold Weinfeld, interim director at Michigan State University's Institute for Public Policy and Social Research, noting that makes it “ever more challenging these days. Response rates are very low right now – people are shying away from surveys and polls right now.”

The advent of Caller ID alerts recipients to the likelihood that a call is a poll – or a telemarketer, or someone they don't know and don't care to speak to. Increasingly, people just don't answer the phone, meaning pollsters have to increase the number of calls they make in order to reach a viable number of respondents.

Another challenge – “In a tight labor market, it's harder to hire surveyors,” Weinfeld said. “Ours rely on (Michigan State) students (as does Quinnipiac). We are competing with retailers, restaurants, fast food. We have had our own challenges – so we recently raised our wages to be competitive, in order to have enough interviewers to handle the number of calls and polls we have to make.”

Weinfeld said the right number of respondents for an adequate poll depends on what the poll is being conducted for, what methodology is being used, and what geographic area is being covered.

“For our state of the state (polling), we want to get to 900, 1,000 respondents,” he said, meaning they have to make several thousand calls to reach that many respondents who answer and fit the right demographic mix.
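The arithmetic behind that call volume is straightforward. Below is a minimal sketch of it – the response rates are chosen purely for illustration and are not figures from any of the pollsters quoted here.

```python
import math

def calls_needed(target_respondents: int, response_rate: float) -> int:
    """Rough number of dials required to reach a target number of
    completed interviews, given an assumed response rate."""
    return math.ceil(target_respondents / response_rate)

# Illustrative only: at a 20 percent response rate, 1,000 completes take about 5,000 dials
print(calls_needed(1000, 0.20))    # 5000
# At the sub-1-percent rates reported for some district polls, dial counts balloon
print(calls_needed(465, 0.007))    # 66429 -- roughly the scale described below
```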

A striking example of how many calls are needed to reach an ever-shrinking pool of people willing to answer is the rolling polling of Michigan's 11th Congressional District (Birmingham, Bloomfield Hills, Rochester Hills, the western Oakland lakes area and western Wayne County) by The New York Times, as reported by MIRS. Of the 66,770 calls the newspaper said it made in the district, MIRS reported it was able to get only 465 people to respond to the poll – a slim 0.7 percent response rate. Results on October 10 showed Democrat Haley Stevens leading Republican Lena Epstein, 45 to 38 percent, with a margin of error of five percent, which pollsters consider high.

In the 8th Congressional District, covering Rochester, Rochester Hills and northern Oakland County through Livingston County to Lansing and East Lansing, the response rate was slightly higher – 0.9 percent, with 501 people picking up the phone out of 53,590 called. In that race, which many feel will be too close to call through election day, incumbent Rep. Mike Bishop (R) was up over Democratic challenger Elissa Slotkin, 47 to 44 percent. But FiveThirtyEight, the site run by noted poll analyst and pundit Nate Silver, had it much tighter over the same period, MIRS reported – Bishop at 48.6 percent and Slotkin at 48.3 percent.

To many, that's a dead heat.

“Our margin of error is two to three percent,” said Michigan State's Weinfeld. “The lower the margin of error, the more reliable the poll, and the more reliable your entity is seen as a research source.

“In a close race, plus or minus five or nine is too high a rate of error,” he noted. “We're trying to be relevant so people rush out to the polls. We'd rather have as many respondents as possible, with as low a margin of error as possible.”

Doyle, of MRG, said, “The smaller the sample size, the larger the margin of error. It makes it much less reliable. If you get 300 respondents (in a Congressional race or state Senate race), with a plus or minus three percent, once you look at the smaller quantities within that, the actual margins of error go up. For example, if you have 152 within that group, the margin of error goes up. If you have 50 people over the age of 65 voting – the margin of error goes up.

“The reason people do polling is because it's generally reliable – but only if it's done right,” Doyle continued. “It needs to be a geographically appropriate size, with an accurate demographic representation. If you're polling eight precincts in the 40th House District (Birmingham, Bloomfield Hills, Bloomfield Township, the eastern portion of West Bloomfield), you can't talk to 40 percent from West Bloomfield, and they can't all be males.”
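As a rough guide to the figures Doyle and others cite, here is a minimal sketch of the standard worst-case margin-of-error calculation for a simple random sample at roughly 95 percent confidence. Real polls weight and adjust their samples, so published numbers can differ from this back-of-the-envelope version.

```python
import math

def margin_of_error(sample_size: int, z: float = 1.96) -> float:
    """Worst-case (50/50 split) margin of error for a simple random
    sample, at about 95 percent confidence."""
    return z * math.sqrt(0.25 / sample_size)

# 600 respondents -> roughly +/- 4 points, in line with the statewide polls cited below
print(round(margin_of_error(600) * 100, 1))   # 4.0
# A 150-person subgroup within a sample -> about +/- 8 points
print(round(margin_of_error(150) * 100, 1))   # 8.0
```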

Bernie Porn, of EPIC/MRA, does primarily live polls, seeking a sample of 600 respondents statewide, with a margin of error of no more than plus or minus four percent. The firm's mix is approximately 30 percent cell phones, “and it will probably be 35 percent cell phones next cycle. You have to have cell phones to represent all voters.

“If 50 percent said, 'Yeah, it's a beautiful day out today,' that means it could be 54 percent, or down to 46 percent,” Porn said. Robocalls, he said, can be somewhat accurate only in primaries, “when you're only talking to Republicans or Democrats, or in school elections, because younger voters don't vote in those.”

He quoted a recent poll which showed Slotkin up over Bishop by four points – the margin of error.

“If there is a Blue Wave, the Democrats will rise up, and she will capture the Independents, as well, for a solid win,” Porn prognosticated, based on his polling. He said pre-existing conditions are “really hitting home for voters – it's a huge concern.” And that is Slotkin's signature calling card over Bishop.

“The federal tax cuts – people are slightly against them,” he said. “People are realizing the benefits are chump change versus the health care costs they are seeing. If Bishop loses, it will be because of that issue.”

Darnoi, of Densar Consulting, thinks that is the race that is too close to call, and that Bishop could pull it off – but only if “Mike holds on to his female voters in Rochester Hills – women who have typically identified as Republicans but may be identifying as Independents this cycle – and does well with ticket splitters in Independence Township.”

Darnoi said the biggest thing being seen this cycle is women who have traditionally been Republicans, supportive of Republican candidates and Republican policies, “are now in the polling, supportive of Independents, and self-identifying as Independents. It's not good for Republicans right now. It shows there's a stain on the Republican brand right now – that people do not want to say, 'I'm a Republican' right now.

“Those voters will dictate and set the tone – just because those voters aren't self-identifying as Republicans doesn't mean they support liberal policies,” Darnoi explained. “Demographics will dictate, district by district.”

He agrees that “it's hard to trust a poll with a margin of error greater than four percent – because if it's five percent or bigger, there's something wrong with the assumptions you have to make. It means your poll is unbalanced somewhere – something is off. There's not enough women, not enough women under 40, not enough people earning X. Then you're making an error in your judgement in the polling. With a large margin of error it should be discounted – it's not worth the paper it's printed on.”

In 2016, “We said polling nationally was showing that Clinton was up three percent – and it is what she won by,” Porn said. “So the polling was accurate for the popular vote. Where the disconnect was at the time was in the social media impact – emphasizing how unpopular she was, and it may have been where voting or influence was suppressed, or where there was outside influence.

“If not for that, in Michigan, I think she would have won.”

With a plethora of polls being disclosed this election cycle, Czuba's advice is that “if the media is going to report it, I say, 'show me the full poll. Show me the demographics, the region, the age breakdown, racial breakdown, gender – this year you must have accurate age breakdowns. If, as a reporter, you can't see the full