Why Humidity Isn’t The Best Measure For “Humid”


This is the time of year we use words like “humid” and “muggy” to describe the afternoon. Meteorologists throw around numbers associated with humidity and dew point. But do you understand what those two terms really mean?

First, let’s properly define the two. The dew point is the temperature to which the air must cool in order to become saturated; it’s the temperature at which dew can actually form. In other words, the dew point is a measure of how much moisture is in the air.

The relative humidity is also a measure of moisture, but it measures how close the air is to saturation, expressed as a percentage. At 100% relative humidity, the air is saturated.


So in the morning, temperatures are cooler and closer to the dew point, which means higher relative humidity. At the hottest part of the day, the dew point may hold steady or even rise, but the temperature climbs much farther above it, so the relative humidity is technically lower. Still, because high dew points make it feel sticky outside, we use the word “humid” to describe how we feel. We’re not wrong to use that term; the air is just far from 100% relative humidity.
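If you like to see the math behind that, here’s a quick sketch using the standard Magnus approximation for saturation vapor pressure (a common textbook formula, not anything specific to this article). The morning and afternoon numbers below are made up purely for illustration.

```python
import math

def relative_humidity(temp_f, dew_point_f):
    """Approximate relative humidity (%) from air temperature and dew point (deg F),
    using the Magnus formula for saturation vapor pressure."""
    a, b = 17.625, 243.04            # Magnus coefficients (Alduchov & Eskridge)
    t = (temp_f - 32) * 5 / 9        # convert air temperature to deg C
    td = (dew_point_f - 32) * 5 / 9  # convert dew point to deg C
    # RH is the ratio of actual vapor pressure e(Td) to saturation vapor pressure e_s(T)
    return 100 * math.exp(a * td / (b + td)) / math.exp(a * t / (b + t))

# Same sticky dew point all day, but relative humidity falls as the temperature climbs:
print(round(relative_humidity(72, 68)))  # cool morning  -> about 87% RH
print(round(relative_humidity(92, 68)))  # hot afternoon -> about 46% RH
```

Notice the dew point never changed between those two readings; only the gap between it and the air temperature did, and that gap is all relative humidity really measures.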

Here’s another way of looking at it: picture a cool day and a hot day, both at 50% relative humidity. The hotter day will feel much stickier because warmer air can hold more moisture, so 50% of a larger capacity means more actual water vapor. The more moisture in the air, the hotter you feel and the harder it is on your body.
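Flipping the same Magnus approximation around estimates the dew point for a given temperature and relative humidity. The two days below are hypothetical examples, not readings from the article, but they show how much more moisture 50% humidity represents on a hot day.

```python
import math

def dew_point(temp_f, rh_percent):
    """Approximate dew point (deg F) from air temperature (deg F) and relative
    humidity (%), using the Magnus formula."""
    a, b = 17.625, 243.04            # Magnus coefficients
    t = (temp_f - 32) * 5 / 9        # convert air temperature to deg C
    gamma = math.log(rh_percent / 100) + a * t / (b + t)
    td = b * gamma / (a - gamma)     # dew point in deg C
    return td * 9 / 5 + 32           # back to deg F

# Both days sit at 50% relative humidity, but the hot day carries far more moisture:
print(round(dew_point(60, 50)))  # cool day -> dew point near 41 F, comfortable
print(round(dew_point(95, 50)))  # hot day  -> dew point near 73 F, oppressive
```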

The dew point is often used to describe the comfort level. Once dew points climb into the 70s, the air feels oppressive and is tough on most of us. That’s why the dew point is really the better representation of moisture in the air, especially since relative humidity is at its lowest during the hottest part of the day.

Copyright 2019 Nexstar Broadcasting, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.

