Article - July 13, 2018
What counts as a heat wave varies with season, place, and what conditions are typical in both. For example, along the Gulf Coast from Texas to Florida, temperatures above 90°F (32.2°C) are quite common during summer, so temperatures would need to climb much higher, and stay there longer, to be considered a heat wave. In the north, where hot temperatures are rarer, 90°F (32.2°C) would easily fit the bill. Makes sense, right? Not so fast; it gets a bit more complicated than that.
The U.S. National Weather Service defines a heat wave as “a period of abnormally and uncomfortably hot and unusually humid weather, typically lasting two or more days.” Note the word “uncomfortably.” It makes the definition relative, and it relies greatly on the availability of one of the greatest contributions to mankind: air conditioning.
The last factor in defining a heat wave is the heat index, which measures what the temperature feels like rather than what it actually is by combining the temperature with the level of humidity. This is why, in places with high humidity, the heat index must be taken into account before declaring a heat wave. Although the thermometer may read only 25°C (77.0°F), it can feel much hotter, or at least more uncomfortable. It feels hotter because humidity makes evaporation (in this case, of sweat) less efficient, which makes it harder for your body to cool itself naturally.
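For readers who want to see how the “feels like” number is actually computed, here is a minimal Python sketch based on the Rothfusz regression published by the U.S. National Weather Service. The function name is illustrative, and the official NWS procedure also applies small corrections at very low and very high humidity that are omitted here for brevity.

```python
def heat_index_f(temp_f: float, rel_humidity_pct: float) -> float:
    """Approximate the heat index ("feels like" temperature) in degrees F.

    Sketch of the NWS Rothfusz regression; the official computation
    adds small adjustments at humidity extremes, omitted here.
    """
    t, rh = temp_f, rel_humidity_pct
    # Simple formula; when its average with the air temperature is
    # below 80 degrees F, the NWS uses it instead of the regression.
    simple = 0.5 * (t + 61.0 + (t - 68.0) * 1.2 + rh * 0.094)
    if (simple + t) / 2.0 < 80.0:
        return simple
    # Full Rothfusz regression (coefficients as published by the NWS).
    return (-42.379
            + 2.04901523 * t + 10.14333127 * rh
            - 0.22475541 * t * rh
            - 6.83783e-3 * t * t
            - 5.481717e-2 * rh * rh
            + 1.22874e-3 * t * t * rh
            + 8.5282e-4 * t * rh * rh
            - 1.99e-6 * t * t * rh * rh)


# The same 90 degrees F feels very different depending on humidity:
print(round(heat_index_f(90.0, 70.0)))  # ~106: dangerous, heat-wave territory
print(round(heat_index_f(90.0, 20.0)))  # ~86: dry air can feel below the reading
```

At 90°F, 70% humidity pushes the heat index past 105°F, while the same reading in dry air feels close to, or even below, the actual temperature, which is exactly why humid regions consult the heat index before declaring a heat wave.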
To get a sense of what is considered ‘uncomfortable’, here are the general criteria that various places in the U.S. and Europe use to define a heat wave.