Not necessarily uncertainty about life in general, but hey, lessons can be learned from any forum. Let's talk about meteorological uncertainty. This is, perhaps, an unpopular topic among some weather nerds (or, more to the point, popular only in dimly lit, smoke-filled back rooms that require a password for entrance). I believe this is because, without proper context, "uncertainty" may lead some to believe we meteorologists are not good at our jobs. However, therein lies an important distinction, in my humble opinion, between uncertainty and confidence.
A question I'm commonly asked is: "Why do I need meteorological support when I can just look at my phone?" Sadly, the world is FULL of DIY meteorologists (among many other professions I'm sure, amirite!?). One of the first responses I give to this question is: "At GWG we can convey uncertainty."
So how, pray tell, does that help someone? To answer that we can turn to two tried-and-true methods:
- dictionaries
- stories
1. There are no fewer than 2^6 ways to define this (okay, there are fewer, but that's 64 for those who don't want to math). For the sake of this discussion I'm going to use: "not known beyond a doubt" (taken from Uncertain Definition & Meaning - Merriam-Webster). Perhaps the first takeaway here is a question:
Does the presence of doubt mean that a forecast is not valid or informative?
Certainly not! In fact, I would make the case that as meteorologists, we should know the limitations of our observational networks (such as weather stations, radars, satellites, etc.) and, in the case of forecasting/nowcasting, the limitations of computational models (these are the models used to make forecasts).
Danger creeps in when we fail to make these uncertainties known to those with whom we communicate. If we "blow" a forecast that we purported to be rock solid, it likely means we either did not adequately convey the uncertainties to our clients or, in some cases, overstated our confidence in our ability to interpret what we saw.
Why? I'm sure there are a bajillion reasons, one likely being societal: we are often taught that uncertainty is a sign of naiveté or ignorance. In this case it's the opposite! We're conveying our knowledge of the limitations of our observational networks and our forecast models. As for confidence, one could argue that by admitting up front that we have low confidence in a forecast, we may actually gain the trust of our clients, because they know we will not oversell our capabilities. A wise person once said: "know what you don't know."
So, that's all well and good, but what is the distinction I'm trying to make between uncertainty and confidence? At GWG, we take confidence to reflect OUR abilities. As a CCM, I'm obligated to tell those who trust me as a meteorologist when I am not confident in my ability to adequately fulfill their needs. Uncertainty, by contrast, reflects the limitations of our observational capacity and computing power: essentially, items beyond our control.
2. Let's tell a story. Sometimes that makes things easier. A few weeks back, there was a forecast that called for a 30% chance of rain in the afternoon. This happened to be the first evening of Lee's Summit, MO's "Downtown Days". Now, Michelle and I looked at that day's forecast and said, "man...people outside at Downtown Days really need to pay attention; we are confident they are going to see thunderstorms this afternoon."
STOP! Did you see a subtle distinction there? The probability-of-precipitation forecast conveyed A LOT of uncertainty through a statistical value of "30%". However, by knowing the limitations of the observational data AND the forecast models, GWG was able to interpret that uncertainty and use our confidence in understanding those limitations to more accurately depict the meteorological conditions that afternoon. This doesn't mean the 30% was wrong; in fact, in a future blog article, GWG will discuss the 54,382,988 different misinterpretations we've heard for what "probability of precipitation" actually means.
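To see how one number can hide two very different situations, consider the standard U.S. National Weather Service definition: probability of precipitation is forecaster confidence multiplied by expected areal coverage. The scenario numbers below are hypothetical, purely for illustration:

```python
# Probability of precipitation (PoP) per the U.S. National Weather Service
# definition: PoP = C x A, where C is the forecaster's confidence that
# precipitation will occur somewhere in the forecast area, and A is the
# fraction of the area expected to receive it.
# The scenario values below are hypothetical illustrations, not real data.

def pop(confidence: float, areal_coverage: float) -> float:
    """Return probability of precipitation as a fraction (0.0-1.0)."""
    return confidence * areal_coverage

# Two very different meteorological situations, same headline number:
coin_flip = pop(confidence=0.5, areal_coverage=0.6)   # unsure it rains at all
sure_but_spotty = pop(confidence=1.0, areal_coverage=0.3)  # certain, but isolated

print(f"{coin_flip:.0%} vs {sure_but_spotty:.0%}")  # both read as 30%
```

The point of the sketch: the "30%" the public sees collapses confidence and coverage into one value, which is exactly where a meteorologist's interpretation adds something your phone can't.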
I am not too proud to admit that I've been on the other side of this as well. For instance, when I'm confident that severe weather will impact an area and folks there all walk away with sunburns. #severeclear. Generally, when looking back at these events, while my confidence was high, I failed to address the uncertainty in the forecast and what the forecast models were missing. Perhaps it was a subtle inversion at mid-levels, or weaker-than-expected surface convergence, or maybe birds just didn't fart at the right time. The atmosphere is complex, and ALL of these things can play a role. In fact, in meteorology (and really all sciences) we are constantly discovering new processes that shape when, where, why, and how weather events affect life and property.
What I'm hoping you'll take away from this, assuming you read this far #tl;dr, is that:
- uncertainty and confidence, while often related, have important distinctions
- uncertainty and confidence (or lack thereof) can make or break client communication
- one could be wrong for the right reasons or right for the wrong reasons
- meteorologists should be conveying both the uncertainty (limitations of our science) and our confidence (ability to interpret data) separately and clearly