Voters look to political polls to understand who is winning the presidential race. But interpreting these numbers can be difficult. Diane and guests explore the art and science of political polls.

Guests

  • Andrew Kohut, Director of the Pew Research Center for the People and the Press
  • Whit Ayres, President and founder of North Star Opinion Research
  • Mark Blumenthal, Senior polling editor of the Huffington Post

Transcript

  • 10:06:56

    MS. DIANE REHMThanks for joining us. I'm Diane Rehm. In a close presidential race such as this one, it's tempting to see poll numbers as an early indicator of which candidate could win. But experts warn not to take individual polls at face value. To talk about the art and science of interpreting the numbers, here in the studio: Andy Kohut. He's director of the Pew Research Center for the People and the Press. Whit Ayres is president of North Star Opinion Research, and Mark Blumenthal, senior polling editor of the Huffington Post.

  • 10:07:35

    MS. DIANE REHMWe invite your questions and comments. Join us on 800-433-8850. Send your email to drshow@wamu.org. Follow us on Facebook or Twitter. Good morning to all of you. Thanks for being here.

  • 10:07:54

    MR. ANDREW KOHUTGood morning, Diane.

  • 10:07:55

    MR. WHIT AYRESGood morning.

  • 10:07:55

    MR. MARK BLUMENTHALGood morning.

  • 10:07:56

    REHMAndy, let's start with you. What makes a good, trustworthy poll?

  • 10:08:03

    KOHUTThat's a complicated question. I'll do my best to answer it. First of all, in terms of the sampling, all citizens, or the population that you're covering, should have an equal or at least known probability of being selected. That is, everyone should be potential -- a potential respondent. You can't rule out a segment of the public, such as people with cellphones or some other slice of the public. So the draw has to include everyone. It has to be random. It has to be rigorous enough to get to people who are reluctant respondents.

  • 10:08:43

    KOHUTIt can't be an opt-in survey where people volunteer to take a poll. It has to be we go out and get them. That works. Opt-in gets people who are interested. People who are interested are not typical. So that's on sample. On questioning, the questions have to be fair and unbiased. They have to be even-handed questions, do you approve or disapprove, giving all of the alternatives or both of the alternatives. They have -- the questionnaire has to be structured in a way that it doesn't lead the respondent by the selection of questions and the placement of questions to certain points of view.

  • 10:09:27

    KOHUTFor example, before you ask who -- for whom are you going to vote, saying, how do you feel about the economy or how do you feel about some issue that favors one of the candidates doesn't quite -- doesn't measure up because, on Election Day, there's no -- not someone there outside of the polling booth, saying, hey, how do you feel about the economy or how do you feel about crime or whatever the issue has to be.

  • 10:09:53

    KOHUTAnd then, finally, the analysis has to be rigorous, both in terms of making sure that the sample reflects the population that you're going after, and the reading of the data and the analysis of the data has to be thorough, so we understand what's going on.

  • 10:10:14

    REHMWhit Ayres, do you want to add to that?

  • 10:10:18

    AYRESSure. I'd like to start off by saying what a pleasure it is to be back on the show with you, Diane...

  • 10:10:23

    REHMThank you.

  • 10:10:23

    AYRES...and what a privilege it is to be with Andy Kohut and Mark Blumenthal. These guys are both giants in our field, and it's a real privilege to be with both of them. I agree with everything Andy said. There are a number of other considerations. You already talked about art and science of polling. A question should never give a hint as to a correct answer. You shouldn't lead the respondent one direction or the other. And often, that's a very subtle matter of wording.

  • 10:10:52

    AYRESBut your listeners can read a question and use their common sense and ask, are they leading the witness, are they leading the respondent in a certain direction? There are a number of other considerations, too, like question order. Sometimes you can ask some questions, before you ask the ballot test, that lead people one way or the other even though the ballot test question itself could be perfectly fair and objective.

  • 10:11:16

    AYRESThat's why we believe you ought to release all the questions in a survey if you're going to release results of a survey so that the intelligent consumer can go through there and say, were they leading the respondents before they got to the critical questions?

  • 10:11:31

    REHMAnd to you, Mark Blumenthal, in the past, how reliable have polls been? How clearly did they indicate a winner and a loser prior to an election?

  • 10:11:47

    BLUMENTHALWell, I'm going to take my personal privilege and thank Whit for the kind words. And thank you, Diane, for having me on as a guest.

  • 10:11:52

    REHMCertainly.

  • 10:11:53

    BLUMENTHALIn presidential elections over the last 30 or 40 years, polling has become more accurate over time, not less. With all of the challenges clearly facing the field over the last decade or so, the last two presidential elections, two or three in particular, were really remarkably accurate. They were collectively as accurate as you would expect from a true random sample, even though there are many reasons why we might doubt that we've been able to achieve that level of perfection.

  • 10:12:23

    BLUMENTHALAnd they had relatively little bias. That is, if you looked at the average margin in 2008 and 2004, separating the Democratic and Republican nominees, the final result was very close to that. That's not always been the case. But it's a sort of a paradox because, at the same time, we have all these challenges that I know we're going to get to in the rest of the show.

  • 10:12:45

    REHMMm hmm. Well, talk for a moment about aggregating polls and how important that is.

  • 10:12:54

    BLUMENTHALWell, that's something that I bring to this discussion that we try to do. If the only difference between polls were random sampling -- if all of them were doing equivalent samples of the true electorate -- I don't think there'd be any controversy whatsoever about aggregating polls. We'd be reducing the random error by looking at lots of them, because any one poll -- we use this term margin of error.

  • 10:13:19

    BLUMENTHALAny one poll, because it's a sample and not the full population, has a certain amount of variability that comes sort of automatically and built-in, and if you're able to look at, say, 10 polls instead of one, you can reduce that variability. The question that some of my colleagues have about putting them all together is that sometimes there are very, very high quality polls, like the ones produced by the Pew Research Center, and others that, to be kind, sometimes draw the trade-off line in a different place as to what kind of...

  • 10:13:50

    REHMWhat do you mean, to be kind, and what do you mean, draw the trade-off line in a different place?

  • 10:13:54

    BLUMENTHALSure. A good example is that over the last 10 years, we've seen a real explosion in surveys done using an automated methodology. So that's where the calls are made not by a live interviewer but by a recorded voice, by a computer that dials the phones automatically. And when the potential respondent picks up, they're asked to answer questions by pushing buttons on their phone.

  • 10:14:22

    BLUMENTHALThere are a couple of drawbacks with this, and I'll try to do it really quickly. The pollster can't pick a random person within the household. They typically just take whoever answers the phone. There's no interviewer there to clarify the question or help the respondent if they're confused. And there's at least a theory that the response rates are lower.

  • 10:14:41

    BLUMENTHALOn the other hand -- and I -- we have a mutual colleague who likes to say -- Scott Keeter, who's the director of -- I'm going to mangle his title, but leads the research at the Pew Research Center, who says, despite violating virtually every rule that we were taught, that I was taught by people like Andy Kohut, these surveys have done relatively well over the last two or three election cycles in getting the horserace numbers right at the end.

  • 10:15:07

    BLUMENTHALAnd there's one other big, big problem that they're facing this year which is that they're prohibited by federal law from calling cellphones. And that may not have been as big a problem four and eight years ago. It is appearing to make a bigger difference this year.

  • 10:15:21

    REHMMark Blumenthal, he is senior polling editor for the Huffington Post. Whit Ayres is president of North Star Opinion Research. He's a Republican Pollster. Andy Kohut is director of the Pew Research Center for the People and the Press. You are welcome to join the conversation. Call us on 800-433-8850. Send us your email to drshow@wamu.org. Whit Ayres, to you, Andy Kohut talked about the importance of the sample. How do you select your sample?

  • 10:16:09

    AYRESYou select samples in one of two ways. Generally, you'll use a random digit dialing methodology, where you randomly rotate digits off of known working telephone numbers. Another way to do it is from a list of people who have voted in a particular primary, for example, or in a congressional district, and you select randomly from within that list of people who voted. There's no one right or wrong methodology. It depends upon the race you're polling.
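
As a rough illustration of the random digit dialing idea Ayres describes, randomly rotating the final digits of known working telephone numbers, here is a minimal Python sketch. The exchange blocks, sample size and helper name are invented for the example, not any pollster's actual procedure.

```python
import random

# Hypothetical "known working" area code + exchange blocks (first six digits).
# In practice these come from databases of active telephone number banks.
working_blocks = ["202555", "202556", "703555"]

def rdd_sample(n, blocks, seed=0):
    """Draw n phone numbers by attaching random last-four digits to known working blocks."""
    rng = random.Random(seed)
    sample = set()
    while len(sample) < n:
        block = rng.choice(blocks)
        last_four = f"{rng.randrange(10000):04d}"  # randomize the final digits
        sample.add(block + last_four)
    return sorted(sample)

print(rdd_sample(5, working_blocks))
```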

  • 10:16:42

    AYRESFor example, if you're polling a highly gerrymandered congressional district, say one that runs along an interstate for a few miles on either side, then it's almost impossible to do a random digit dialing sample of that district responsibly. You're much better off using a list of past voters in that district, matching their names with telephone numbers and calling them randomly.

  • 10:17:07

    REHMSo what are you likely to get if, in the case of the gerrymandered district, you simply use a randomly selected group? What are you likely to get from that?

  • 10:17:21

    AYRESAre you talking about randomly selected from the list of people who voted in that district...

  • 10:17:25

    REHMNo.

  • 10:17:26

    AYRES...you're talking about random digit dialing sample?

  • 10:17:27

    REHMExactly.

  • 10:17:28

    AYRESPeople don't know what congressional district they're in. You ask them, are you in the fifth district? Are you in the sixth district? Are you in the fourth -- they don't know. And so it's very difficult to get a valid sample in a gerrymandered district from a random digit dialing sample. There are ways you can jerry-rig it. But we feel like you...

  • 10:17:48

    REHMJerry-rig the gerrymandering.

  • 10:17:50

    AYRESThat's right. But we feel you're much better off getting a list of people who've actually voted in that district and are registered on the voting rolls within that gerrymandered district.

  • 10:17:59

    REHMMark, what's wrong with that?

  • 10:18:00

    BLUMENTHALI was -- well, I was prepared to say what I think is right with that. I think what some would say is wrong with it -- the one drawback with voter lists -- is that sometimes they don't -- when you register to vote, you usually don't give the registrar your phone number. Most states don't ask for it.

  • 10:18:17

    BLUMENTHALSo when you get a list of registered voters from the state or from the District of Columbia, you don't have phone numbers, and, as Whit says, you have to go out to a commercial service and have the names matched with phone numbers, which is an imperfect process. And sometimes you can be missing a large number of phone numbers.

  • 10:18:33

    REHMMark Blumenthal, he is senior polling editor of the Huffington Post. I see we have many calls. We'll take a short break, talk a bit further and then take your calls. Stay with us.

  • 10:20:04

    REHMAnd welcome back as we talk about political polling, how it's done, who is sampled, what the questions are, the overall methodology. Whit, you mentioned likely voters in your voting sample because you are looking at people who have voted previously. How do you make sure that you're getting an equal number of both political parties when you do that?

  • 10:20:42

    AYRESOne of the most important factors to look at when you're comparing polls is the party balance in the sample. It's perfectly possible using very professional techniques to do monthly polls where one month, in a national survey, you have equal numbers of Democrats and Republicans, the next month, you have 10 points more Democrats than Republicans, and the third month, you're back to equal numbers.

  • 10:21:08

    AYRESI will guarantee you that Barack Obama's numbers and all the Democrats' numbers go up by 10 points when you go from an equal balance to plus 10 Democrat. Then they go down by 10 points the next month when you have equal numbers of Republicans and Democrats, when nothing has changed. Fortunately, for national surveys, we have exit polls that go back three decades now.

  • 10:21:31

    REHMBut, of course, that's after the fact.

  • 10:21:33

    AYRESIt's after the fact, but it's a pretty good example of what actually happened in an electorate. And we know that Democrats have outnumbered Republicans by a maximum of seven percentage points. That's happened once in 2008. We know that Republicans have never outnumbered Democrats in the last 28 years. The best they've done is even. That happened in 2004.

  • 10:21:57

    REHMAll right. And, Andy, he was talking about exit polls. What about the question of who, among those who have been polled, actually turns up at the polls?

  • 10:22:12

    KOHUTI have to address this partisan issue.

  • 10:22:16

    REHMDon't address the partisan issue right now. Just go ahead and answer me on the question of who, having been polled, actually shows up at the voting polls.

  • 10:22:32

    KOHUTWhat we do is we have 10 questions, which we know are correlated -- and I don't mean speculate, I mean know -- are correlated with voting. And we put those questions together in a scale, an arithmetic scale, and we predict each -- the probability of each respondent voting. Now, how do we know? We know because we go back after the election, and we look up the respondents' voting behavior.

  • 10:23:04

    KOHUTSo in -- this is the only case where you have a question and you can actually look at how well that question does in predicting the behavior of a respondent. There's no speculation.
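
To make the scale Kohut describes concrete, here is a minimal sketch of one way an additive likely-voter screen can work: each turnout-related answer scores a point, respondents are ranked by their total, and the top slice, sized to an assumed turnout rate, is treated as the likely electorate. The items, respondents and turnout figure are illustrative assumptions, not the actual Pew or Gallup model.

```python
# Each respondent's answers to turnout-related items, scored 0 or 1.
# (Illustrative items and respondents, not the actual question battery.)
respondents = [
    {"id": 1, "plans_to_vote": 1, "voted_last_time": 1, "thought_given": 1, "knows_polling_place": 1},
    {"id": 2, "plans_to_vote": 1, "voted_last_time": 0, "thought_given": 1, "knows_polling_place": 0},
    {"id": 3, "plans_to_vote": 1, "voted_last_time": 0, "thought_given": 0, "knows_polling_place": 0},
    {"id": 4, "plans_to_vote": 0, "voted_last_time": 0, "thought_given": 0, "knows_polling_place": 0},
]

ASSUMED_TURNOUT = 0.50  # assumed share of the sample expected to actually vote

def likely_voters(resps, turnout):
    """Sum the items into an arithmetic scale and keep the highest-scoring share."""
    scored = sorted(resps, key=lambda r: sum(v for k, v in r.items() if k != "id"), reverse=True)
    cutoff = round(len(scored) * turnout)
    return [r["id"] for r in scored[:cutoff]]

print(likely_voters(respondents, ASSUMED_TURNOUT))  # here: respondents 1 and 2
```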

  • 10:23:16

    REHMIs there one question that always says, do you plan to vote on Election Day?

  • 10:23:24

    KOHUTThere is that. There is the question of how much thought have you given the election and how, you know, where people go to vote...

  • 10:23:31

    REHMOK.

  • 10:23:32

    KOHUT...you know, if early voting's a possibility. We have a whole range of questions. We have eight questions or nine questions. I've lost track.

  • 10:23:38

    REHMOK. Mark.

  • 10:23:39

    BLUMENTHALThis is the real challenge and the place where the science of survey research meets the art of political polling. One of the big problems is that that one question, do you plan to vote, and most of the rest are a little like asking people the week after New Year's, are you planning to keep that resolution you've made, to exercise more and eat better? And, of course, people are going to say, of course, I am, even though many of them will not.

  • 10:24:05

    BLUMENTHALAnd there's a tendency on the part of most -- on the part of many Americans to say, yes, I'm planning to vote or, yes -- or even, yes, I voted, when they didn't. So the procedures that Andy's describing, that the Pew Research Center uses, which are very similar to the Gallup likely voter model that's gotten a lot of attention, involve trying to put a reality check on those answers and kind of dial back a little bit on the self-expressed enthusiasm about voting.

  • 10:24:34

    REHMBefore we go any further, can you explain to me -- because we've seen some new polls this morning that, in the very important key states, show President Obama up by five points in Ohio, in Pennsylvania and one other state, whereas Gallup is showing a very different picture. How do we figure that out? We have invited Gallup to be on the program with us this morning. We haven't heard from them yet.

  • 10:25:14

    BLUMENTHALWell, you know, this is where I rely on looking at a lot of different polls. We called it aggregating before. You can just call it looking across the spectrum of the data available. As of -- over the last 24 hours, there were something like eight surveys that were released. Now, they run the gamut from very high quality to surveys done without a random sample on the Internet. But among the telephone surveys, there was Gallup.

  • 10:25:41

    BLUMENTHALThere was another telephone survey done by Investor's Business Daily that had Obama up by six. There were four more that were somewhere in between. There were two, including the NBC/Wall Street Journal poll, that had an absolute tie. My recommendation to ordinary people is to not obsess about any one survey as the received truth of public opinion at the moment but to try to look at all of them.

  • 10:26:04

    BLUMENTHALAnd one thing to remember now is that the race is really tied. Let's just assume for a moment that our aggregate is correct, that it's really a 47-47 tie. Then surveys that have a margin of error of plus or minus three are going to fall randomly around that total. Yet we find it irresistible to write narratives about up two, down one and see great differences in meaning between those, when, in reality, they're all telling us the same thing.
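
A small simulation can make Blumenthal's point concrete: if the race truly sits at 47-47, individual polls of about a thousand interviews (roughly a plus-or-minus three point margin of error on each candidate's share) will scatter a few points either way, while an average across them lands much closer to the tie. The sample size, number of polls and random seed are assumptions chosen purely for the illustration.

```python
import random, statistics

TRUE_SHARE = 0.47   # assume each candidate truly has 47 percent
N = 1000            # interviews per poll
POLLS = 10

rng = random.Random(42)

def one_poll():
    """Simulate one poll of N respondents when the race is truly tied at 47-47."""
    obama = romney = 0
    for _ in range(N):
        u = rng.random()
        if u < TRUE_SHARE:
            obama += 1
        elif u < 2 * TRUE_SHARE:
            romney += 1
        # otherwise undecided / other
    return 100 * (obama - romney) / N  # headline margin in points

margins = [one_poll() for _ in range(POLLS)]
print([round(m, 1) for m in margins])      # individual polls swing a few points either way
print(round(statistics.mean(margins), 1))  # the average sits much closer to zero
```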

  • 10:26:30

    REHMAndy.

  • 10:26:31

    KOHUTPart of the issue with Gallup is it is not a single survey. It's a moving average of a week's worth of interviewing. So a goodly portion of what they're reporting today or last week was conducted prior to the second debate. And therefore, it's not surprising that Romney would be doing better and Obama worse in a poll that has some significant portion of the interviewing that was done...

  • 10:26:57

    REHMBefore...

  • 10:26:58

    KOHUTBefore...

  • 10:26:58

    REHM...the second debate.

  • 10:26:59

    KOHUTBefore the second debate.
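
Kohut's description of the Gallup tracker as a moving average can be illustrated with a toy rolling mean: when each reported number pools the last several days of interviewing, a shift that happens on a single day, such as after a debate, only works its way into the reported figure gradually. The daily margins and seven-day window below are invented for the example, not Gallup's actual data.

```python
# Hypothetical daily margins (Obama minus Romney, in points), with a shift after day 7.
daily_margins = [4, 4, 5, 4, 4, 5, 4,    # before the debate
                 0, -1, 0, 1, 0, -1, 0]  # after the debate

WINDOW = 7  # days of interviewing pooled into each reported number

def rolling_average(series, window):
    """Report, for each day, the average of the most recent `window` days."""
    out = []
    for i in range(window - 1, len(series)):
        chunk = series[i - window + 1 : i + 1]
        out.append(round(sum(chunk) / window, 1))
    return out

print(rolling_average(daily_margins, WINDOW))
# The reported number drifts down over several days even though the
# underlying shift happened all at once.
```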

  • 10:27:00

    REHMWhit Ayres, how do you see it?

  • 10:27:01

    AYRESI'd like to strongly second the motion that Mark made, that the way to get a real sense of public opinion is to take an average of various polls. The average or the model that Mark does is available on pollster.com. RealClearPolitics also does an average. But that washes out some of the random variation you get, and I think that's probably as close to the real number as you're likely to get, given the fact that there's a range of quality in those polls.

  • 10:27:33

    REHMMark, do you see an equal number of Democrats and Republicans being polled?

  • 10:27:41

    BLUMENTHALNo. And we shouldn't, because we should see voters being polled in, you know, in ways that represent their true composition in the electorate. But to get to the point about what exit polls show about party and what other polls show about party, I think it's important to remember that party identification, at least the way pollsters measure it, is something different from how you may be registered to vote if you're in a state that has registration.

  • 10:28:05

    BLUMENTHALIt's about how you feel. It's about, you know, the party you identify with, and in some ways, it can be a measure of your sense of the race just like your vote choice is. And the political scientists who have studied this have found that party identification, to the extent that it does move for some people, tends to move around elections. I know the Pew Research Center has projects. And I see Andy getting ready to jump out of his chair about this, so we should let him talk.

  • 10:28:28

    REHMYeah, he sure is.

  • 10:28:30

    KOHUTYeah, this notion that party affiliation is a fixed attitude that you can correct is wrongheaded, in my view, because what we know about party affiliation is that it varies with what you're questioning people about in terms of their response. If I take you through an interview, and all of a sudden you've become more inclined to vote for Mitt Romney, and you tell me Romney, Romney, Romney, Romney, and at the end of the interview, I say to you, Diane, do you think of yourself as a Republican or Democrat?

  • 10:29:03

    KOHUTYou are much more likely to say, I think of myself as a Republican, than if I'd interviewed you four weeks earlier, when you had said all of those things about Obama. So part of the variation by party affiliation, and I'm going to give you an example of this, has to do with the way the election is going. And if you adjust by it, you are adjusting by what you've set out to measure. So in '06, when the Democrats did very well, they had a 49 to 41 percent margin on party affiliation.

  • 10:29:40

    KOHUTNow, by '10, that had shrunk to 47 to 43 in an election where the Republicans were going to do very well -- that is, in 2010. If you adjusted the polls based upon either exit polls or previous opinion polls based on party affiliation, you're standardizing by what you're measuring.

  • 10:30:01

    REHMWhit.

  • 10:30:02

    AYRESWe all know that party ID can change over the course of an election, but it tends to change very gradually. You're not going to have party ID be 10 points more Democrat than Republican on one week and three weeks later be five points more Republican than Democrat. And we know from exit polls that if there's a poll out there that has more than 7 percentage points more Democrat than Republican in the sample, that's outside of anything we have seen in the last 28 years.

  • 10:30:37

    REHMSo what does that say to you?

  • 10:30:38

    AYRESWhat it says to me is that we should adjust our samples by an average party ID over the course of an election cycle.
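
For readers who want to see what "adjusting a sample by party ID" would mean mechanically, here is a minimal sketch of weighting respondents so the sample's party mix matches a chosen target, for instance an average over the election cycle. The sample counts and target shares are invented, and note that Kohut argues above against correcting to a party target at all.

```python
# Observed party mix in a hypothetical sample of 1,000 interviews.
sample_counts = {"Democrat": 380, "Republican": 300, "Independent": 320}

# Target mix a pollster might weight to, e.g. an average over the election cycle.
# (Illustrative numbers, not an actual benchmark.)
target_shares = {"Democrat": 0.35, "Republican": 0.32, "Independent": 0.33}

def party_weights(counts, targets):
    """Weight = target share / observed share, applied to every respondent in that group."""
    total = sum(counts.values())
    return {party: targets[party] / (counts[party] / total) for party in counts}

weights = party_weights(sample_counts, target_shares)
print({p: round(w, 2) for p, w in weights.items()})
# Democrats get a weight below 1 (over-represented relative to the target),
# Republicans a weight above 1, and weighted tallies then use these multipliers.
```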

  • 10:30:48

    REHMDoes that make sense to you, Mark?

  • 10:30:50

    BLUMENTHALI'm less comfortable adjusting polls to the exit poll just because it's a different survey with a different way of measuring. One of the things we've focused on -- I mean, I want to say that there is a lot that -- there is a great deal that Whit says that I find -- I'm in agreement with, one of which is that if you do see a one-survey switch from -- see a big change in its party composition, which I think it's fair to say the Pew Research surveys between September and October did, it is saying something about the composition of the likely electorate.

  • 10:31:22

    BLUMENTHALAnd I think Andy would probably grant that much of that change was about changes in the enthusiasm that made for a different kind of likely electorate after the first debate than before.

  • 10:31:34

    BLUMENTHALBut I'd throw out one little thing, which is that if you look at the exit poll results from four years ago, you'll typically see a much bigger percentage -- or, I'm sorry, a much smaller percentage who say they identify with neither party than typically identify as independent or no party on the telephone surveys before. And that's kind of to be expected because you've -- after you voted, you just committed the most partisan act that most people do every four years.

  • 10:32:00

    REHMWe're getting several emails along this line, which say, "While candidates, journalists and pollsters love polls, how do polls help voters? It seems they're a distraction and do nothing to improve our ability to choose the best candidate. Some countries even ban publishing poll data close to an election." Andy.

  • 10:32:30

    KOHUTWell, the election is a process -- it's how Americans make up their minds. They don't just get up on Election Day and decide, and they don't get up on Sept. 15 and decide. It's a process, and telling the story of that process is a piece of political journalism.

  • 10:32:51

    REHMAnd you're listening to "The Diane Rehm Show." But here's another question along that line. "Please discuss the impact of reporting poll results over and over, often long before elections. Do the polls adversely affect any races? Is there any way to lessen the possibility that polls actually discourage voting?" Mark.

  • 10:33:22

    BLUMENTHALI'll -- before I answer it, I just want to throw out one anecdote, which is that as of this morning on the Huffington Post version of pollster.com elections, not huffingtonpost.com, if you look at our current national average, there are less than one-tenth of 1 percent separating Obama and Romney. Now, if -- we'll get to the possible detrimental impact of polls, but if voters can take one message from that at the moment, it's that your vote will count this election as much as it has ever counted. And you have, you know, reasonably scientific evidence of that.

  • 10:33:57

    BLUMENTHALI have not seen from the world of political science much evidence that the proliferation of polling has had any impact on preferences. Now that may just be because they haven't done enough work to prove it. I get feedback every time I write an article on the interest level that more partisan voters have in polling. They love it. They cannot get enough of it, and our traffic lights up around that.

  • 10:34:27

    BLUMENTHALAnd I think that's because people who love politics tend to follow it a little bit like sports. They want to know how their team is doing. They want to know if their side is going to win. I don't know how that helps ordinary or, I should say, swing voters and those who are trying to make up their minds. I would say there isn't much help there for the true uncommitted voter.

  • 10:34:48

    REHMDo any of you expect to see the vote as being so close that, in fact, you may get the popular vote going one way and the Electoral College going another? Andy.

  • 10:35:04

    KOHUTI think it's a distinct possibility. I mean, as Mark pointed out, the polls are so close at this point. People are so conflicted about these candidates. They fault Obama for his performance. They're still unsure about Romney in personal terms. So it could be that we will have a very close popular vote. But in terms of key states, it could go one way or the other.

  • 10:35:32

    REHMAnd, Whit Ayres, here's another email from Mary Jane, who says, "If many people are voting early and most of the early voters are Democrats, doesn't this skew the polls? If you take more Democrats out of the potential voter group, won't the polls appear as if Romney were gaining ground when he is not?"

  • 10:36:02

    AYRESNot if the poll is well done. I think all of us now, now that early voting has started, start off with a question about do you intend to vote, or have you already voted, or do you intend to vote early? And so what you should do is pick up the partisan preferences, the vote of the people who have actually voted early already. There is substantial evidence that people who vote early tend to be the strongest partisans, the ones who made up their minds months ago about who they're going to vote for. And so a well-designed poll should not be skewed by substantial early voting.

  • 10:36:44

    REHMDo you agree with that, Mark?

  • 10:36:45

    BLUMENTHALI've seen no survey that excludes early voters from their samples. If any did, then you would certainly have that effect. I think the more complicated question is the degree to which pollsters have learned how to incorporate early voting correctly. As Andy -- as we said earlier, we've learned lessons over 40 or 50 years about how to discount the tendency of voters to overstate their likelihood of voting. But the questions that we're now asking to identify who's voting early are relatively new ones, and we don't have as much experience with them.

  • 10:37:19

    REHMBut do polls really shape public opinion?

  • 10:37:25

    BLUMENTHALI think that they do very rarely. I think that if I -- I hear the music, so I think we may have to go. But the one place I think they do is in multi-candidate primaries where sometimes the polls show somebody moving up, and it helps people decide, if they have several candidates they like, who has the best chance of winning.

  • 10:37:43

    REHMAll right. Short break. We'll be right back.

  • 10:40:04

    REHMAnd just before we go to the phones -- an email that represents a number of others. This one from Andrew, who says, "I'm hoping your guests could share their opinions on Nate Silver of FiveThirtyEight fame and his statistical approach to polling interpretation. Specifically, he takes into account economic data and what he calls state fundamentals. Is this valid? It seems that that should be represented in the polls already." Mark.

  • 10:40:48

    BLUMENTHALI wonder if the host of "Meet the Press" is asked what he thinks of the host of "This Week" on ABC News. I get asked about Nate Silver a lot, and I suppose, in some sense, what we do is a little competitive. So I think that what...

  • 10:41:04

    REHMHaving said that...

  • 10:41:05

    BLUMENTHALRight. What he does is an aggregation which is not different than ours in our model. It has -- what we're doing has a lot in common with what he popularized in 2008. And what he does draws off of the averaging that RealClearPolitics and the rest of us did. Nate adds one sort of step to this, which is to build in other data, data on the economy and -- which gets more into the realm of forecasting which has a long storied history. Political scientists have done this for a long time.

  • 10:41:39

    BLUMENTHALIt's not a bad thing, but it's different. It sort of mixes -- for me, it mixes polling with assumptions about what may or may not affect the vote. And I find it a little harder to follow although, if you look at the bottom-line numbers, they're about the same in terms of our estimates...

  • 10:41:53

    REHMInteresting.

  • 10:41:53

    BLUMENTHAL...for most states of the nation.

  • 10:41:54

    REHMAll right. To Manassas, Va. Good morning, Elizabeth. You're on the air.

  • 10:41:59

    ELIZABETHYes. Good morning. This is a fascinating show.

  • 10:42:02

    REHMGood.

  • 10:42:03

    ELIZABETHAnd I have tremendous respect for pollsters. I must admit I get very frustrated with the media and would be interested in your guests' response to this as well as yours, Diane, because most of the time, what we hear reported as a lead story, either on the front page of a newspaper or the lead story in a news show, is something like that Gallup poll which is a national poll, which really has nothing to do with who we're electing president since we elect presidents based on the electoral college.

  • 10:42:37

    ELIZABETHAnd you frequently have to go five or six pages into a newspaper before you get reports on the battleground states, which are the things that will probably actually decide the election, and I don't know why the national media is so hung up on reporting national polls.

  • 10:42:56

    REHMWell, we've been trying to talk about a variety here...

  • 10:43:01

    KOHUTI think.

  • 10:43:02

    REHM...this morning, Andy?

  • 10:43:03

    KOHUTI think the answer to this is that, first of all, the media is increasingly covering battleground state polls. If we look at The New York Times, The Washington Post and other online news sources, the combative states -- the competitive states -- are regularly polled. And, secondly, with respect to national surveys, the trends that national surveys detect and pick up are manifested in the battleground states. So the trend to Romney nationally is now being seen in the battleground states.

  • 10:43:42

    REHMMark.

  • 10:43:42

    BLUMENTHALI have two reasons that I personally pay attention to national polls in this context. One is, frankly, that they're of higher quality. The national pollsters tend to spend more money on their sampling and work harder to get respondents on the phone. And so it's an interesting comparison. But also to the point about you can draw from the national surveys information about the states, that is something that Nate Silver did rather well four years ago and that we're doing with our model this year.

  • 10:44:15

    BLUMENTHALAnd I'll just throw out some data. I have a spreadsheet that I've been looking at every morning for a couple of weeks where we take the results of the nine battleground states that the campaigns have spent money in and estimate what their combined vote is. And as of this morning, it is a little bit better for President Obama -- about a percentage point better, maybe a little less than that -- but not very different.

  • 10:44:38

    AYRESAnd we got a great example of what both Andy and Mark are talking about with the first presidential debate, which was widely viewed as a big win for Mitt Romney. If you look at polls the week before that debate and compare them to the week after, nationally it looked like it was about a five-point net move toward Romney. He'd been about three-and-a-half down. He was about a point and a half up afterwards.

  • 10:45:03

    AYRESBut that was followed by that same change going on in the battleground states. So while Elizabeth is right that it's going to be determined by the battleground states, in the case of Romney, a rising tide lifted all votes.

  • 10:45:17

    REHMAll right. To Syracuse, N.Y. Hi there, David.

  • 10:45:21

    DAVIDGood morning, Diane. Pleased to speak with you.

  • 10:45:22

    REHMThank you.

  • 10:45:24

    DAVIDI wanted to inquire about the prevalence of slanted polls and to what degree they reflect on the candidates that sponsor them. The context being two years ago when local representative, the incumbent Daniel Maffei, Democrat, was being challenged by the Republican Anne Marie Buerkle, I received -- who ultimately won.

  • 10:45:44

    DAVIDI received a poll with questions that were crudely slanted in favor of Buerkle. It got to a point where they asked: given the fact that Dan Maffei had voted for the Obama health care plan, would that make me much less likely to vote for him, somewhat less likely to vote for him or not affect my decision? I said, well, actually, it would make me more likely to vote for him. The pollsters said, well, that was not one of the possible options.

  • 10:46:12

    DAVIDI had to answer one of their ways, or I'd be rejected from the poll. And so I refused to lie, so I was rejected from the poll. A week or two later, I got another poll -- again, slanted questions for Buerkle but no more deal breakers -- and I've wondered if they deliberately called me back to get me in the poll because they rejected me before. Is that a courtesy, or was that unethical of them, to have a non-representative sample? So...

  • 10:46:37

    REHMMark.

  • 10:46:38

    BLUMENTHALThere are a bunch of questions there. I'd love to answer all of them. I bet I'm not going to have time. Let's back up and talk about the kind of poll that the listener got which was almost certainly the kind of survey that Whit gets paid to do, and that when I used to do this as a Democratic campaign pollster, I got paid to do, which is that they were -- they're doing a real survey but they're testing messages -- that's the term of art.

  • 10:47:00

    BLUMENTHALThey're trying to figure out what kinds of things they can say in campaign ads, in mailers, on the stump, that will have the greatest impact, and some of that is the testing of negative information. So that question that the listener recalled hearing, would this make you much more likely, much less likely, somewhat less likely and so on, is a very standard kind of question. And the people that receive these calls are often quite troubled by them.

  • 10:47:27

    BLUMENTHALThe specific that he mentioned, which was someone not giving him the opportunity to say more likely, strikes me as unusual. That would be unethical and dumb practice because you'd want to get the...

  • 10:47:39

    REHMAndy, you want to comment?

  • 10:47:40

    BLUMENTHAL…respondents' answer.

  • 10:47:41

    KOHUTThe only comment I have is that some of these questions -- some of these questionnaires are so -- this questioning is so obvious that the only conclusion a respondent can come to is that there's something not right here. And one of the benefits these days for legitimate polls is that all of the questions are on the Internet. And if you want to find out how Gallup or Pew or NBC asked a particular question, you can look it up. If you can't look up what the question wording is, you have a right to be suspicious.

  • 10:48:21

    BLUMENTHALAnd...

  • 10:48:21

    REHMAll right.

  • 10:48:22

    BLUMENTHAL...you'd never bump somebody out because they don't give the right answer. That's just flat wrong.

  • 10:48:27

    REHMYou'd never, but somebody did.

  • 10:48:30

    BLUMENTHALWell, yeah, and that's just a bad poll.

  • 10:48:30

    REHMYeah, clearly. All right. To Lexington, Ky. Jodie, you're on the air.

  • 10:48:39

    JODIEHey, Diane. Thanks for taking my call.

  • 10:48:41

    REHMSure.

  • 10:48:42

    JODIEI was thinking about the phone-based surveys, and you're saying that they can't call cellphones. I would say that people who only have landlines now are probably going to continue to be more conservative.

  • 10:49:03

    REHMDo you agree with that, Andy?

  • 10:49:05

    KOHUTYou're wrong and you're right. People who have only landlines, as opposed to cellphones, are more conservative. But we do call cellphones. Forty percent of our calls are to cellphones, 60 percent to landlines. And we're going to 50-50 shortly.

  • 10:49:22

    BLUMENTHALAnd it's -- he is picking up on something that I said, which was that the automated polls are barred by law from calling cellphones.

  • 10:49:30

    REHMRight.

  • 10:49:31

    BLUMENTHALAnd it is considerably more expensive to do it this way, and it does move us into a territory where the rules are still being developed. I mean, it's a credit to the Pew Research Center. They've been, for all practical purposes, the research and development wing of the survey industry for the last 10 years in helping pollsters understand how to do this.

  • 10:49:52

    BLUMENTHALBut in fairness, we are still figuring out, you know, the days where you had one sample and everyone had an equal or known probability of being selected, which was the sort of the quick way of describing how this is supposed to work. That's gone. We have -- we're not -- we have guesses at your probability of being selected, but it's not much better than that.
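
One way to read Blumenthal's remark about "guesses at your probability of being selected" is the base weighting used in dual-frame (landline plus cellphone) designs: a respondent's weight is the inverse of their estimated chance of ending up in the sample, and someone reachable on both kinds of phone has a higher chance than someone reachable on only one. The selection probabilities below are invented placeholders, not real rates, and this is a sketch of the general idea rather than any particular pollster's method.

```python
# Assumed chances of being drawn into each sample frame (illustrative only).
P_LANDLINE_FRAME = 0.002   # chance a given landline household is dialed
P_CELL_FRAME = 0.001       # chance a given cellphone is dialed

def base_weight(has_landline, has_cell):
    """Weight = 1 / probability of selection across the frames the person is reachable on."""
    p = 0.0
    if has_landline:
        p += P_LANDLINE_FRAME
    if has_cell:
        p += P_CELL_FRAME
    if p == 0:
        raise ValueError("unreachable by phone; not covered by this design")
    return 1 / p

print(round(base_weight(True, False)))   # landline-only respondent
print(round(base_weight(False, True)))   # cell-only respondent
print(round(base_weight(True, True)))    # dual users get a smaller weight
```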

  • 10:50:10

    REHMAnd speaking of guesses, here is a caller in Orlando, Fla. Hi there, Greg.

  • 10:50:18

    GREGHi. Thanks for having me on the show.

  • 10:50:20

    REHMSure.

  • 10:50:21

    GREGWe're here in Orlando, and we're getting quite a few calls, probably four or five a week at least, from poll takers and surveyors. And when we ask for whom the poll is being conducted, we almost never are able to get that information from the folks who are taking the poll. And, you know, we're really not sure if it's from a campaign or a legitimate news source. If it is legitimate, are they usually willing to share that information?

  • 10:50:49

    REHMWhit.

  • 10:50:50

    AYRESThere's a reason why they won't tell you who's sponsoring the poll. And that is, we are very concerned that if we tell you, this is being sponsored by a campaign for candidate X, that you're more likely to give a favorable or maybe an unfavorable response to the question about candidate X. We will not even let the people who do the calls for our surveys know the sponsor of the survey because we don't want to bias the results. We want to give our clients pure unbiased answers. And knowing the sponsor, we're concerned, will affect the answers.

  • 10:51:26

    REHMAndy.

  • 10:51:26

    KOHUTWhat -- in my view, what the pollster should do in that case is say, we can't tell you now, but we promise to tell you at the conclusion of this interview who sponsored this survey.

  • 10:51:35

    AYRESI'm OK with that.

  • 10:51:36

    KOHUTYeah.

  • 10:51:37

    BLUMENTHALThere is the third category of calls that people have been getting, and that is the campaigns trying to identify voters, those who are undecided or those who are their supporters. And sometimes they will make these calls under the guise of a survey. So it sounds to the person answering the phone like this is a survey call because they ask the same questions. And I think that's a little troubling for those of us who are in the survey business.

  • 10:51:58

    REHMSo if I get a telephone call, how am I going to know whether it's a legitimate polling company or whether it's some tiny little group on some specific issue? How do I know that the call is legitimate, Mark?

  • 10:52:19

    KOHUTAsk.

  • 10:52:21

    REHMAndy.

  • 10:52:21

    KOHUTAsk who sponsors this survey.

  • 10:52:24

    REHMBut if they won't tell me?

  • 10:52:25

    KOHUTI would hang up.

  • 10:52:26

    REHMOK.

  • 10:52:26

    BLUMENTHALYeah. I think it's your right at any time to end that call if you're not comfortable. I think that even when a survey is sponsored by a campaign, the calling center that's making the call should be willing to identify itself. So the caller, if they have any suspicion about who it is that's placing the call, can call them back, look them up online and make sure they're legit.

  • 10:52:49

    REHMAnd you're listening to "The Diane Rehm Show." I'd like, if I could, to get some examples of questions that you, Whit Ayres, might be asking potential voters on the phone.

  • 10:53:10

    AYRESThere are a number of very standard questions about...

  • 10:53:12

    REHMSuch as?

  • 10:53:13

    AYRESDo you have a favorable or unfavorable opinion of Mitt Romney, Barack Obama? If the election were held today, would you be voting for Barack Obama and Joe Biden or Mitt Romney and Paul Ryan?

  • 10:53:24

    REHMIt's as straightforward as that?

  • 10:53:25

    AYRESI mean, they're just very straight.

  • 10:53:27

    REHMUh huh.

  • 10:53:28

    AYRESBut the more creative questions involve the message testing that Mark mentioned earlier, where you will say, the Republican candidate says, and then you basically give an argument for why Barack Obama's economic plan is not working. It's time to try something different. The Democratic candidate says, Barack Obama's economic plan is working. It just needs more time. And then you can flesh those out.

  • 10:53:53

    AYRESThose are very creative questions that provide a lot of advice to candidates on how they can more effectively make a campaign message that will be persuasive.

  • 10:54:04

    REHMHow long does each of these calls take, Mark?

  • 10:54:07

    BLUMENTHALIt depends on their purpose, but an in-depth survey can take 15 or 20 minutes.

  • 10:54:13

    REHMFifteen to 20 minutes.

  • 10:54:14

    KOHUTIf the pollster is so inclined, it can take three or four. One of -- I mean, I think that's a great question, and it speaks to the interest level in polling that always spikes this time of year. And it's the reason why you're having us on this morning, that for most people, polling is the question, if the election were held today, for whom do you vote, that we ask about the horse race. But most of the surveys that are done, whether they are done by outfits like Pew Research or by the news media or by campaigns or by other entities, are looking beyond the horse race.

  • 10:54:50

    REHMOf course, you have what one would think are unbiased polls coming from, say, Pew or The Washington Post or The Wall Street Journal, NBC News. And you've got specifically biased polls coming from somebody like Whit Ayres?

  • 10:55:14

    KOHUTWell -- go ahead, Whit.

  • 10:55:17

    AYRESNo, ma'am.

  • 10:55:18

    REHMOK.

  • 10:55:18

    AYRESNo, ma'am. We are paid to give our clients the truth as we can best determine it. And I will tell you that the polls done by partisan polling outfits, be they Democrat or Republican, are every bit as high quality as those done by public newspaper entities. In fact, often, they have a better set of data because they're asking more questions about that particular race and that particular state than a newspaper entity would when they drop in and take a snapshot.

  • 10:55:53

    AYRESSo if I see a poll put out by a Democratic competitor of mine, I know that that's a well-done poll. I also know I'm only going to see those poll numbers they want me to see, the ones that look like they will help their candidates.

  • 10:56:07

    BLUMENTHALThat's right.

  • 10:56:07

    AYRESBut it's a real mistake to assume that just because a poll is put out by a Democratic pollster or Republican pollster that the numbers are somehow skewed. They are not.

  • 10:56:17

    BLUMENTHALRight. Go ahead.

  • 10:56:18

    REHMAndy.

  • 10:56:19

    KOHUTYeah. And I think they also have different purposes as Whit described. They're trying to provide strategic information and tactical information for their clients. Our job is to tell the story of the election, so we are going broader and deeper in some respects. And the political polls are being used to fashion strategies and advertising and a whole range of things.

  • 10:56:44

    REHMBut how do -- how does one spill over into the larger context?

  • 10:56:50

    KOHUTWell, I think we're both looking for the image of candidates. We're both looking for what -- what's driving voter choice, what are the turn-ons and turn-offs of various candidates. We want to tell the story. They want to use it to elect their clients.

  • 10:57:06

    REHMAndy Kohut, director of the Pew Research Center for the People and the Press, Whit Ayres, president of North Star Opinion Research, he's a Republican pollster, and Mark Blumenthal, senior polling editor of the Huffington Post. Gentlemen, we shall see. Thank you so much for being here.

  • 10:57:30

    AYRESThank you, Diane.

  • 10:57:30

    KOHUTThank you.

  • 10:57:31

    REHMAnd thanks for listening, all. I'm Diane Rehm.
