Nation: Where the Polls Went Wrong

CBS News and the New York Times disagree over how much impact the hostage crisis had; CBS says not much, while the Times analysis says it "was a major element."

Looking for explanations of what went wrong, Wirthlin believes that the other pollsters erred by estimating that there would be more Democrats in the final body of voters than there turned out to be. He also criticizes the others for asking the key presidential-choice question first instead of last, after asking about issues and impressions of the candidates. This, he insists, produced a pro-Carter bias.

Mitofsky disagrees strenuously with the criticisms. Says he: "I can't buy their approach to making estimates from data. I'm not prepared to throw out our techniques just because one poll produced a different number. In fact, if we were doing this all again, I would not change a single thing except to poll the last two days of the campaign. To believe their figures, too many other people have to be wrong."

But Neft at Harris thinks Mitofsky's post-election poll was wrong and was designed to explain away earlier numbers. Neft dismisses the notion that huge changes occurred at the last minute. Says he: "Nothing like that quantity and magnitude happened." He explains the Harris four-point discrepancy by citing unexpectedly low turnout among Democrats on Election Day, a view shared by Gallup.

Two basic conclusions jump out of the unhappy experiences of the pollsters. First, most of the private surveyors stopped work too early to pick up the last-minute switches, whether the change was enormous, as most now believe, or whether, in Wirthlin's phrase, "the mountain didn't jump—it slid a little." The reason that most private firms did not survey intensively right up until the last moment is simple: it would have cost too much.

The price of interviewing a single voter and then adding the data to the calculations is about $15. A major national survey usually contacts at least 1,500 people, running up a bill of about $22,500.
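
The arithmetic is simple; the sketch below multiplies the per-interview price by the sample size, and the nightly-tracking extrapolation at the end assumes a ten-week campaign purely for illustration, a figure not given in the article.

    # Rough cost arithmetic for a single national survey, using the figures above:
    # roughly $15 per completed interview and a sample of at least 1,500 people.
    cost_per_interview = 15        # dollars, approximate
    sample_size = 1_500            # typical national sample

    one_survey = cost_per_interview * sample_size
    print(f"One national survey: ${one_survey:,}")        # $22,500

    # Tracking nightly at that scale over a ten-week fall campaign (an assumed
    # 70 nights, not a figure from the article) multiplies the bill quickly.
    print(f"70 nightly surveys: ${one_survey * 70:,}")     # $1,575,000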

As it happened, only the candidates themselves were prepared to spend that kind of money, time and again. Harris, for example, spent $350,000 on presidential polling from Labor Day on, whereas Caddell ran up bills of some $2 million. Wirthlin's operation spent $1.3 million and surveyed 500 people every night of the fall campaign until the last few days, when it contacted 1,000 nightly. The findings were then calculated on a rolling three-day average, which Wirthlin contends evened out the peaks and valleys that other pollsters perceived with their single-shot surveys. Wirthlin is frank enough to admit that he had a great advantage over the public pollsters. Says he: "Their major problem was the lack of resources and lack of continuity."
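
A minimal sketch of that kind of three-day rolling average, with nightly percentages invented purely for illustration (the article does not report Wirthlin's actual nightly numbers):

    # Smooth nightly tracking results with a three-day rolling average, the
    # technique the article attributes to Wirthlin's operation. The nightly
    # shares below are hypothetical, not his data.
    nightly_share = [44.0, 47.5, 43.0, 48.0, 46.5, 50.0, 49.0]

    def rolling_average(values, window=3):
        """Average each night with the window - 1 nights that precede it."""
        return [
            sum(values[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(values))
        ]

    print(rolling_average(nightly_share))
    # Any single night's swing moves the reported figure by only a third as much,
    # which is how the rolling average irons out the peaks and valleys that
    # single-shot polls register.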

In mid-October, the discrepancies between Wirthlin's findings and those of the published surveys created a near panic in the Reagan camp. Under pressure from their colleagues, Wirthlin and his assistants spent a frantic three days reviewing their numbers and techniques. They decided they were right, but Caddell, for one, still believes that they had Reagan too far ahead too early.
