Now, heat is one thing. Humidity is what makes a heat wave especially tough to deal with. All of that water vapor in the air makes it hard to breathe, hard to cool off, and just downright uncomfortable. It is also during these hot spells that you start hearing a lot about the relative humidity and the dew point when you watch your local weather reports. But what does that all mean, and which one is a better measure of just how humid it is out there? That's what we're about to find out. Let the debate begin! Dew point temperature vs. relative humidity! DING! DING!
First up, what the heck is the dew point? It is the temperature to which you would need to cool the air in order to saturate it. In simple terms, it is the temperature at which dew would form if the air were cooled. As such, it is a direct measure of the amount of water vapor in the air, and thus a direct measure of just how humid it is. The higher the dew point temperature, the more moisture the air holds. Dew point is a parameter that always exists with the day-to-day weather. The reason we tend to talk about it more during hot weather is that it causes far more dangerous and noticeable effects in warm air than in cold. For more details on those detrimental effects, see my previous post "Summer Arrives by Kicking in the Front Door." For reference, here's an index for the ambient dew point and the comfort level.
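To make that comfort index concrete, here's a minimal sketch of how you might map a dew point reading to a comfort label. The thresholds below are approximate, commonly cited values for the eastern U.S. in summer, not an official standard, so treat them as assumptions:

```python
def comfort_level(dew_point_f):
    """Map a dew point temperature (Fahrenheit) to a rough comfort label.

    Thresholds are approximate, commonly quoted values; they are not an
    official scale, and comfort varies by person and by region.
    """
    if dew_point_f < 55:
        return "comfortable"
    elif dew_point_f < 65:
        return "sticky"
    elif dew_point_f < 70:
        return "humid"
    else:
        return "oppressive"

print(comfort_level(50))  # a dry, pleasant day
print(comfort_level(72))  # the kind of day this post is about
```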
So what about the other player, relative humidity (RH)? We hear about it year-round, but what exactly are you being told when you see a relative humidity of, say, 79%? In all honesty, not a whole lot. The first problem with relative humidity is right in the name. It is a "relative" value, and is actually dimensionless, expressed simply as a percentage. But a percentage of what? Relative humidity is formally defined as the amount of atmospheric moisture present relative to the amount that would be present if the air were saturated. Confusing, right? The best I can do to clarify that definition is to say that RH tells you how much water vapor is in the air relative to how much would be there if the air were cooled to the dew point. (Doesn't really clarify things, does it?) Another problem with relative humidity is that it is not an absolute figure. That is, it is not a tangible quantity that you can feel, touch, or measure directly. It is simply a mathematical ratio, a function of both moisture content and temperature; the second half of the formal definition relies on temperature. As confusing as that may sound, does anything about it jump out at you? Since RH is a function of moisture content and temperature, it actually relies partly on the dew point! So when all is said and done (thanks for hanging in there, by the way...), relative humidity does not directly indicate the amount of water in the air. It does not tell you how humid the air actually is.
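You can see that dependence on temperature in a few lines of code. This sketch approximates RH from the air temperature and the dew point using the Magnus formula for saturation vapor pressure (the coefficients 17.625 and 243.04 are one commonly used fit; other fits exist). Note how the same dew point, i.e. the same amount of moisture in the air, gives two very different RH values at two different temperatures:

```python
import math

def relative_humidity(temp_c, dew_point_c):
    """Approximate relative humidity (%) from air temperature and
    dew point (both in Celsius), via the Magnus approximation.

    RH is the ratio of actual vapor pressure (set by the dew point)
    to saturation vapor pressure (set by the air temperature).
    """
    a, b = 17.625, 243.04  # Magnus coefficients (one common fit)
    e_actual = math.exp(a * dew_point_c / (b + dew_point_c))
    e_saturation = math.exp(a * temp_c / (b + temp_c))
    return 100.0 * e_actual / e_saturation

# Same moisture content (dew point of 18 °C), two air temperatures:
print(relative_humidity(20.0, 18.0))  # mild day: RH comes out high
print(relative_humidity(32.0, 18.0))  # hot day: RH comes out much lower
```

The air is exactly as moist in both cases, yet the RH readings differ wildly, which is precisely why RH alone can't tell you how humid it feels outside.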
Then why even mention the relative humidity in a weather forecast or report? That's more of an opinion question, and is left more to the discretion of individual meteorologists. Some use it, some don't. (Personally, I don't care for it in my own forecasts.)
So they've duked it out, exchanged some haymakers, and left a black eye or two, but who comes out on top when it comes to measuring how humid the air actually is? Dew point is our clear winner! So, the next time hot weather rolls into town, if you want to know how humid it is, look up the dew point temperature and see where it falls on the comfort index! After all, summer has only just begun!
Thanks for reading!
Until next time...