How pollsters decode polls

While we're being deluged with polls now, the only 'poll' that matters is the one on May 2.

- April 20, 2011

Spring arrives on Parliament Hill.

How do we assess the wide variety of poll results we've seen during this election? We asked our resident polling expert at pilipili, Peter Butler, to give us some tips.

Dr. Butler ran the Atlantic office of Decima (now Decima Harris) under Alan Gregg for more than a dozen years. He also worked for Western Opinion Research, a research firm out of Winnipeg, and RDI, a local market research firm (now defunct). He's been a consultant for provincial and federal governments, and has done issues research and political polls for the Conservative Party of Canada. He is still associated with The Strategic Counsel, a public issues-oriented research firm in Toronto, and is currently professor emeritus of Sociology and Public Administration.

He also wrote one of only two textbooks about public opinion polling in Canada, entitled Polling and Public Opinion: A Canadian Perspective.

1) Question the randomness and design.

The trustworthiness of a poll comes down to three things: the sample design, the quality of the measurements employed, and confidence in the data collection process.

The first thing Dr. Butler thinks about is what kind of sample was taken. The irony of this, Dr. Butler admits, is that commercial pollsters (himself included) are notoriously secretive about their methods.

"I even say to students, I'm not telling you how I do it, because that's how I make my money! It's very much a closely guarded secret," he says.

That said, without knowing exactly how they do it, he trusts Nanos, Decima Harris, Angus Reid and other big-name pollsters because they can afford expensive polling equipment like predictive dialers, which generate phone numbers and manage the randomness of samples.

"A CATI (Computer Assisted Telephone Interview) system and a predictive dialer sorts through the supervision of the data collection very effectively," he says.

2) Ask how the questions were asked.

"What you want to know is not just the opinion," says Dr. Butler, "it's how intensely held the opinion is."

Find out if a poll result came from asking a single question or if it represents a summation of several questions. For instance, instead of simply asking "Who will you vote for?", Dr. Butler normally uses five questions to determine a respondent's vote choice.

"Asking one question is not enough for me. I group my questions. And there are a number of statistical techniques that are done so that a batch of questions is asked to reflect an issue."

He also likes to know when a question was asked. Did they ask it at the beginning or the end of a questionnaire? The placement influences responses, since respondents usually try to answer consistently over the course of the interview.

3) Check if the poll was a 'one-off' or part of a series of polls.

Polls can be conducted over a period of days or they can be a 'one-off'. A poll should explain this part of the methodology from the outset. If pollsters don't reveal this, they can sin by omission.

"Are they telling you the nightly results (of a good night) or do they roll up results at the end of the week," he asks. "We just collected 500 cases last night and this is what it shows. But the night before it didn't show up."

Dr. Butler believes polls become more accurate over time. He likes seeing how polls track responses night after night.

"Nik Nanos says, and I think he's right, that the repeated numbers of polls that are done give you a closer and closer perspective on the accuracy of the opinions being reported."
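The tracking-poll approach Dr. Butler describes can be sketched as a sample-size-weighted rolling average of nightly results. Everything below is an illustrative assumption, not any firm's actual method; the nightly figures are invented.

```python
# A minimal sketch of a rolling ("tracking") poll: each night a fresh
# sample is interviewed, and the published number combines the most
# recent nights rather than reporting a single (possibly lucky) night.
# All figures are invented for illustration.

nightly = [
    # (night, sample_size, share_for_party_A)
    ("Mon", 500, 0.36),
    ("Tue", 500, 0.40),
    ("Wed", 500, 0.37),
    ("Thu", 500, 0.39),
]

def rolling_share(nights, window=3):
    """Sample-size-weighted vote share over the last `window` nights."""
    recent = nights[-window:]
    total_n = sum(n for _, n, _ in recent)
    return sum(n * share for _, n, share in recent) / total_n

# Thursday's published figure averages Tue-Thu instead of one night alone.
print(round(rolling_share(nightly), 3))  # → 0.387
```

Rolling several nights together is what smooths out the "good night" effect Dr. Butler warns about: one unusually favourable sample is diluted by the nights around it.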

4) See error beyond the margin of error.

Margin of error tells you how much sampling uncertainty comes with a sample of a given size, but there are many other sources of error in a poll.
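The point is easier to see with the standard formula behind the reported figure: for a simple random sample of size n, the 95 per cent margin of error on a proportion p is roughly 1.96 × √(p(1−p)/n). A minimal sketch (the sample size of 1,000 is illustrative):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error (as a proportion) for a simple random sample.

    p=0.5 is the worst case, which is what pollsters usually quote.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A sample of 1,000 gives the familiar "plus or minus 3.1 points,
# 19 times out of 20". Note the formula involves only the sample size:
# refusal rates, question wording and coverage gaps appear nowhere in it.
print(round(100 * margin_of_error(1000), 1))  # → 3.1
```

That last comment is the heart of Dr. Butler's complaint: a poll can quote a small margin of error while saying nothing about the 2,500 calls that went unanswered.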

"People are not asking this question enough: what is the refusal rate on your survey? You made how many calls so you could get 500 people? Oh, 3,000. Wow. Polling companies don't want to talk about it," says Dr. Butler.

Butler says increasing numbers of people are missed in polls. This is a problem. What if all the university professors decided to go on a 'no call' list or refused to participate? You would have demographic holes that require filling.

"Alan Gregg was saying in a recent Globe and Mail article that the telephone poll is going out the window if we have a 'no call' list. If people have their cell phones off, or just become weary of market researchers, it is going to be difficult to deliver accurate polls by telephone surveys."

5) Pick out the bias in national polls.

National polls give a big picture, but the numbers are often misleading. National data is made up of regional data. Pollsters segment Canada into polling regions; however, small regions often get overlooked, and this creates bias.

"The big picture is going to be stuff that emanates from the centre," he explains. The problem is, "samples get so much smaller in regions that their numbers always have a higher margin of error."

If your sample numbers for a region are low, what do you do? "Did you collect more people than you needed to, or did you do a mathematical calculation, called weighting the sample?"

"But as I tell students, mathematics is not opinion," he says. Many pollsters, being expert statisticians, do the math.

"Everybody does it. How would we (pollsters) get PEI opinions if we didn't weight samples in a national poll, since there is a low likelihood that many cases would be collected from PEI? Well, I have an issue with that! The issue is that it means that a PEI opinion [where] we only got 85 [responses] are coming in as having the opinion of 250. It's not 250 opinions! It's a weight."
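The weighting Dr. Butler objects to reduces to a simple ratio. The 85 and 250 figures are his; how the target would be derived in practice is an illustrative assumption here:

```python
# A minimal sketch of regional weighting, using Dr. Butler's PEI example:
# 85 interviews were actually collected, but PEI's share of a proportional
# national sample works out to 250 (his figure). Each real respondent is
# then multiplied up so the region carries its "correct" share.

collected = 85   # actual PEI interviews in the national sample
target = 250     # interviews PEI "should" contribute proportionally

weight = target / collected
print(round(weight, 2))  # → 2.94
```

So each of the 85 PEI respondents counts roughly 2.94 times in the national total, which is exactly Dr. Butler's complaint: the poll reports 250 PEI opinions when only 85 were ever heard.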

6) Understand the undecided vote.

In the waning part of the election campaign, pollsters try to figure out which camp to put the last percentage of uncommitted voters into. As more people commit, polls get more accurate, but until then, pollsters come up with a battery of questions to better read an undecided electorate.

"Is it really undecided or is it they won't tell you, and if they were going to vote tomorrow, how did they vote in the past," he says. "Everybody in the business has to deal with the notion of how do we get at the undecided vote."

"I'd hate to tell you the tricks we have used in the past," says Dr. Butler, smiling.

He really would, because that would reveal another mystery of his profession.