It's that time of year when we start talking about the humidity. Yes, it can get a little confusing... Humidity vs. relative humidity, and what about the dew point? Let's try to explain...
Lots of people confuse humidity with relative humidity, or how close the atmosphere is to saturation. There are problems with relative humidity, though, the biggest being that it is relative... For example:
A. A temperature of 65 with a dew point of 59 (comfortable range) would yield a relative humidity of 81% and a heat index of 65.
B. A temperature of 95 with a dew point of 68 (uncomfortable) would yield a relative humidity of 42% and a heat index of 100!
In the examples above, I'd rather be outside working in "A".
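If you want to check those heat index numbers yourself, here's a quick sketch using the NWS Rothfusz regression, with the simpler formula the NWS uses for mild conditions where the regression doesn't apply. Temperatures are in °F and relative humidity in percent; the small NWS adjustments for very dry or very humid extremes are skipped here.

```python
def heat_index(temp_f, rh_pct):
    """Approximate NWS heat index (deg F) from temperature (deg F) and RH (%)."""
    # Simple formula, used when conditions are mild (result below 80 deg F).
    simple = 0.5 * (temp_f + 61.0 + (temp_f - 68.0) * 1.2 + rh_pct * 0.094)
    if simple < 80.0:
        return simple
    # Rothfusz regression, used for hotter, more humid conditions.
    t, r = temp_f, rh_pct
    return (-42.379 + 2.04901523 * t + 10.14333127 * r
            - 0.22475541 * t * r - 6.83783e-3 * t * t
            - 5.481717e-2 * r * r + 1.22874e-3 * t * t * r
            + 8.5282e-4 * t * r * r - 1.99e-6 * t * t * r * r)

print(round(heat_index(65, 81)))  # example A: 65
print(round(heat_index(95, 42)))  # example B: 100
```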
You see, the percentage really does not do a good job of telling you how humid it feels. To really comprehend humidity, we must understand two facts about moisture in the atmosphere. First, air at a particular temperature can hold only so much water in the vapor state, and the air is saturated when it is holding the maximum possible amount of water vapor. So relative humidity tells us how much water vapor is in the air as a percentage of that maximum possible amount; it is relative to the air temperature. Second, warm air can hold more water vapor than cold air, and that is why we use the dew point in the summer to describe the humidity (how it feels to us outside). The dew point is the temperature to which the air must cool for its moisture to condense. The higher the dew point, the more "muggy" it will feel.
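To see how the temperature and dew point together pin down the relative humidity, here's a small sketch using the Magnus approximation for saturation vapor pressure. The constants 17.625 and 243.04 are one commonly published set; this is an approximation, not an exact formula.

```python
import math

def relative_humidity(temp_f, dewpoint_f):
    """Relative humidity (%) from temperature and dew point (deg F), via Magnus."""
    a, b = 17.625, 243.04  # Magnus coefficients (deg C)
    t_c = (temp_f - 32) * 5 / 9
    td_c = (dewpoint_f - 32) * 5 / 9
    # RH = actual vapor pressure / saturation vapor pressure, as a percentage
    return 100 * math.exp(a * td_c / (b + td_c)) / math.exp(a * t_c / (b + t_c))

print(round(relative_humidity(65, 59)))  # example A: 81
print(round(relative_humidity(95, 68)))  # example B: 42
```

Note that the same dew point of 68 would give a very different percentage at a different temperature, which is exactly why the percentage alone is a poor measure of mugginess.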
When the air is dry, meaning the humidity is low, perspiration evaporates rapidly. This cools our bodies because evaporating water requires heat, and that heat is taken from the surface of our skin. In contrast, when the air is very moist, meaning the humidity is high, evaporation of perspiration is greatly reduced, so we lose that cooling benefit.
Here's my Dew Point Scale:
< 55... Pleasant
61-65... Getting Muggy