A Meteorologist’s Perspective On Election Polling

The U.S. Presidential election is slowly creeping to a close, and Joe Biden has been projected the winner. He garnered the most votes of any presidential candidate in history and has a sizable (and growing) lead in the popular vote count. Many people have begun to ask if the polling was wrong again. After the 2016 cycle, pre-Election Day polling was placed under significant scrutiny. As an Atmospheric Sciences professor and meteorologist, I have a somewhat different take on the current election projections. Many people still do not understand probability. Trust me, we have a lot of experience with this in the weather community. I'll explain below.

In the day or so before the election, Five Thirty Eight, a reputable outlet that evaluates polling information, gave Vice President Biden an 89% chance of winning. He won. The simple conclusion is that their projection was right, case closed. However, a question nagging me is whether people interpret that 89% as meaning he would win 89% of the electoral votes, or something else entirely. At the end of the day, Biden is projected to have 306 electoral votes and a significant popular vote margin. The graphics on the Five Thirty Eight website on Election Day showed the outcomes of 40,000 election simulations by displaying a sample of 100 of them. Biden won 89 of those 100 sampled outcomes. This is statistics, y'all. There were clearly 11 outcomes in the sample favoring a President Trump victory, so perhaps it should not have been surprising that some aspects of the election were close.
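The logic of "89 wins out of a sample of 100 simulations" can be sketched in a few lines of Python. This is a deliberately crude toy model (a single bell curve of electoral-vote outcomes with made-up parameters), not Five Thirty Eight's actual methodology; it only illustrates how a win probability and a displayed sample relate to a large set of simulations.

```python
import random

random.seed(42)

# Toy stand-in for an election model: each simulated election is a
# Biden electoral-vote total drawn from a simple bell curve.
# The mean and spread here are illustrative assumptions, not 538's model.
N_SIMULATIONS = 40_000
simulations = [random.gauss(335, 60) for _ in range(N_SIMULATIONS)]

# A candidate "wins" a simulated election with 270+ electoral votes.
biden_win_rate = sum(ev >= 270 for ev in simulations) / N_SIMULATIONS

# A graphic might display only a random sample of 100 of the 40,000 runs.
sample = random.sample(simulations, 100)
sample_wins = sum(ev >= 270 for ev in sample)

print(f"Win probability across all runs: {biden_win_rate:.0%}")
print(f"Biden wins in the displayed sample: {sample_wins} of 100")
```

The point of the sketch: the headline probability is just the fraction of simulated worlds the candidate wins, and the losing simulations are real outcomes of the model, not noise to be ignored.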

Nate Silver, Editor-in-Chief of Five Thirty Eight, recently tweeted, "If it's important to convey the uncertainty in polling—and it is—I struggle to see how showing probabilities *isn't* an important part of the toolkit." Uncertainty is inherent to most predictions, including weather forecasts, medical prognoses, and elections. Silver went on to say, "But at some point talking about uncertainty without invoking probabilities is like trying to formulate sentences without using the word 'the'." Silver wrote an excellent post-election analysis of how polling did this cycle. He entitled it, "The Polls Weren't Great. But That's Pretty Normal." The message I take from that piece is that "it's complicated," but that the "polls were wrong" narrative is too heavy-handed.

I will leave it up to the political scientists to debate whether Biden's victory was a landslide. I want to put on my meteorologist cap, because we feel the pain of election prognosticators being criticized. Weather prediction is conducted through a combination of computer modeling, observations, trend analysis, pattern recognition, and experience. Within that basket of methods, uncertainty can creep in through observational error, model assumptions, resolution limitations, and misinterpretation. As such, we often convey forecast information with uncertainty: a "40 percent chance of rain," or the hurricane "cone of uncertainty."

I have written in Forbes previously about how many people do not understand either of these ways of conveying a forecast. For example, people will often complain if it rains when there was a 30% chance of rain. Meanwhile, I am thinking to myself, "It wasn't a 0% chance, so why are you complaining?" This is equivalent to people looking at Biden's 89% chance while overlooking that there was an 11% chance that President Trump could win. It was not zero. By the way, this link provides background on what "percent chance of rain" is actually telling you. You might be surprised.
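For readers curious about that last point: the National Weather Service defines the probability of precipitation (PoP) as forecaster confidence that measurable rain will occur somewhere in the area, multiplied by the fraction of the area expected to receive it. A minimal sketch of that arithmetic, with illustrative numbers of my choosing:

```python
# National Weather Service definition: PoP = C x A, where
#   C = forecaster confidence that measurable precipitation will occur
#       somewhere in the forecast area (0 to 1)
#   A = fraction of the area expected to receive measurable precipitation
def pop(confidence: float, area_fraction: float) -> float:
    """Probability of precipitation, expressed as a fraction 0-1."""
    return confidence * area_fraction

# Example: 50% confident that rain will cover 80% of the area.
print(f"{pop(0.5, 0.8):.0%}")  # -> 40%
```

So a "40% chance of rain" can come from several different confidence/coverage combinations, which is exactly why so many people misread it.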

Another example of how we convey uncertainty in meteorology is the hurricane cone, and it is often misinterpreted too. The National Hurricane Center website notes, "The cone represents the probable track of the center of a tropical cyclone, and is formed by enclosing the area swept out by a set of circles (not shown) along the forecast track (at 12, 24, 36 hours, etc)." The circles are sized so that roughly two-thirds of official forecast errors over a 5-year sample fall within them. You would be surprised at how many people think that if the hurricane does not go down the center of the cone, the forecast was wrong. In actuality, any location within the cone should be on alert.
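That "two-thirds of historical errors" sizing rule can be illustrated with a short sketch. The error values below are invented for illustration, not real National Hurricane Center verification data; the idea is simply that each cone circle's radius is roughly the 67th percentile of past track errors at that lead time.

```python
import statistics

# Hypothetical (made-up) 48-hour track errors in nautical miles from a
# notional 5-year verification sample -- NOT actual NHC statistics.
errors_48h = [35, 50, 60, 70, 80, 90, 105, 120, 140, 180]

# Size the cone circle so that about two-thirds of historical forecast
# errors fall inside it: take roughly the 67th percentile of the errors.
radius_48h = statistics.quantiles(errors_48h, n=3)[-1]  # 2/3 quantile

print(f"48-hour cone circle radius: about {radius_48h:.0f} nautical miles")
```

A direct consequence of this construction is that about one storm in three wanders outside the cone at some point, which is why the center line is the least important part of the graphic.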

There are many other examples I could have given of how people misinterpret weather information that contains uncertainty. For example, the snowfall forecast may call for 2 to 4 inches of snow in Atlanta. If only 2 inches fall, some people will complain that the forecast was wrong. It wasn't wrong at all. Your interpretation of it was wrong, or perhaps there was a "wishcast" for the higher amount.

Weather forecasts are actually quite accurate. However, the examples above, and misguided expectations about how precise forecasts can be, drive a misperception that they are not. What I take away from the polling analysis is that polls certainly are not perfect, but if you understand how to consume them, they tell you something.
