Climate change has outpaced the government’s drought assessment

Low water in the Dirty Devil River in Utah due to drought conditions.

Despite the acceleration of climate change, assumptions of a stationary climate are still incorporated into the management of water resources in the U.S., with a preference for observation records of 60 years or longer when characterizing drought.

Bias emerges from the assumption that conditions observed in the early and mid-20th century are as likely to occur in today’s climate.

In a new study funded in part by the National Integrated Drought Information System (NIDIS), researchers from the Montana Climate Office evaluated the degree to which assumptions of stationarity may bias drought assessment.

The study reveals that drought assessment error is relatively low with short climatology lengths, but that error (with respect to the more recent climate) can increase substantially when longer reference frames are used in regions where the climate is changing rapidly. This increase in error results in patterns of drought metric bias.
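To make the reference-frame dependence concrete, here is a minimal sketch (not the study’s code) of how a drought metric’s value depends on the chosen climatology length, using a simple standardized precipitation anomaly as a stand-in for operational indices such as SPI. The synthetic data, trendless here for simplicity, and all variable names are illustrative assumptions.

```python
import numpy as np

def drought_index(value, reference):
    """Standardized anomaly (z-score) of a precipitation value relative to a
    chosen reference climatology -- a simplified stand-in for indices like SPI."""
    return (value - reference.mean()) / reference.std(ddof=1)

rng = np.random.default_rng(0)
precip = rng.normal(500, 80, size=90)   # 90 years of synthetic annual precipitation (mm)

current = precip[-1]                         # most recent year
z_30 = drought_index(current, precip[-30:])  # 30-year climatology
z_60 = drought_index(current, precip[-60:])  # 60-year climatology
print(f"30-yr index: {z_30:+.2f}  60-yr index: {z_60:+.2f}")
```

The same observed year yields different index values depending solely on which reference window defines “normal”; in a changing climate that difference stops being noise and becomes a systematic bias.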

“Hot spots” of dry bias—i.e., where the long climatology suggests conditions are drier than the shorter, 30-year climatology—were strongly apparent in the southeastern and southwestern U.S., especially during periods of meteorological drought.

In contrast, some locations (such as the Pacific Northwest) are trending wetter and exhibit a strong wet bias during dry times.

Researchers with the Montana Climate Office presented key findings from the study at the October 6 NIDIS Executive Council Meeting.

Climate change shifts the baseline for what should be considered “normal” hydrological conditions, thereby shifting the definition of anomalous dryness, or “drought.”

Generally, 30 years of data is adequate for characterizing a statistical climatology. Shorter reference periods (climatology lengths) based on more recent data are preferable when defining contemporary climatologies in regions with rapidly changing climate, such as the Southwest.
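As a rough illustration of the shifting baseline, the sketch below builds a synthetic record with an assumed drying trend and compares the “normal” implied by the full record against the most recent 30 years. The trend magnitude and noise level are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1931, 2021)
# Synthetic record with an assumed drying trend of ~1 mm/yr plus noise
precip = 500 - 1.0 * (years - years[0]) + rng.normal(0, 40, size=len(years))

full_normal = precip.mean()          # 90-year (1931-2020) baseline
recent_normal = precip[-30:].mean()  # 30-year (1991-2020) baseline
print(f"full-record normal:  {full_normal:.0f} mm")
print(f"recent 30-yr normal: {recent_normal:.0f} mm")
# In a drying region, the long baseline sits above recent conditions,
# so present-day precipitation looks anomalously dry against it.
```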

The reference period chosen to define meteorological drought can have a substantial impact on the apparent spatial extent and duration of drought events.

In regions where aridification is occurring (such as the southwestern United States), longer reference periods result in a dry bias, causing drought to appear more severe than if it were contextualized by more recent conditions.

Conversely, in regions that are becoming wetter (such as the northwestern United States), longer reference periods result in a wet bias, causing drought to appear less severe than if it were contextualized by more recent conditions.
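Both bias directions can be checked numerically. Using the same simplified z-score metric as above, synthetic drying and wetting records show the long-record index reading drier or wetter, respectively, than a recent 30-year baseline. The trend magnitudes, the 90-year record length, and the regional labels are assumptions for illustration, not the study’s data.

```python
import numpy as np

def zscore(value, ref):
    return (value - ref.mean()) / ref.std(ddof=1)

rng = np.random.default_rng(2)
years = np.arange(90)

for name, trend in [("drying (Southwest-like)", -1.5),
                    ("wetting (Northwest-like)", +1.5)]:
    precip = 500 + trend * years + rng.normal(0, 40, size=len(years))
    current = precip[-1]
    # Long-record index minus recent 30-year index: negative = dry bias
    bias = zscore(current, precip) - zscore(current, precip[-30:])
    print(f"{name}: bias = {bias:+.2f} ({'dry' if bias < 0 else 'wet'} bias)")
```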

Ultimately, the choice of reference period for defining drought (or any “normal” climatology) must depend on the adaptive capacity of the system in question (e.g., agriculture, ecosystems, economics).

Climate change is impacting water supplies for communities and ecosystems around the world. Several regions of North America are experiencing decreases in annual rainfall, shifts in the seasonality of precipitation, and an increased frequency of extreme precipitation events.
