Dew point may predict future energy demands

The power industry may be underestimating how much climate change will affect the long-term demand for electricity in the United States, according to a new study.

The study describes the limitations of the prediction models electricity providers and regulators use for medium- and long-term forecasting. It also outlines a new model built around key climate predictors—mean dew point temperature and extreme maximum temperature—that the researchers say presents a more accurate view of how climate change will alter electricity demand.

“Existing energy demand models haven't kept pace with our increasing knowledge of how the climate is changing,” says Sayanti Mukherjee, assistant professor of industrial and systems engineering at the University at Buffalo's School of Engineering and Applied Sciences and lead author of the paper, which appears in Risk Analysis.

“This is troublesome because it could lead to supply inadequacy risks that cause more power outages, which can affect everything from national security and the economy to public health and the environment.”

“The availability of public data in the energy sector, combined with advances in algorithmic modeling, has enabled us to go beyond existing approaches that often exhibit poor predictive performance,” says coauthor Roshanak Nateghi, assistant professor of industrial engineering and environmental and ecological engineering at Purdue University.

“As a result, we're able to better characterize the nexus between energy demand and climate change, and assess future supply inadequacy risks.”

High temps, low temps

The overwhelming majority of climate scientists predict global temperatures will rise throughout the 21st century, which they expect to increase the demand for electricity as more people turn to air conditioners to keep cool.

One of the most common energy modeling platforms the industry uses to predict future electricity demand—MARKAL, named for MARKet and ALlocation—doesn't consider climate variability at all.

And while another common energy-economic model, the National Energy Modeling System, or NEMS, does consider the climate, it's limited to heating and cooling degree days. A heating degree day is a day whose average temperature falls below 65 degrees Fahrenheit (18 degrees Celsius). A cooling degree day is one whose average temperature rises above 65 degrees.

While there are different ways to measure heating and cooling degree days, they are most often calculated by adding the day's high temperature to the day's low temperature, and then dividing the sum by two. For example, a high of 76 degrees and a low of 60 degrees results in an average temperature of 68 degrees.

The trouble with this approach, Mukherjee says, is that it doesn't consider time. For example, it could be 76 degrees for 23 hours and 60 degrees for one hour—yet the average temperature that day would still be recorded as 68 degrees.
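The degree-day arithmetic, and the time-blindness Mukherjee describes, can be sketched in a few lines of Python. The 65-degree balance point and the 76/60-degree example come from the text; the hourly series is the hypothetical day described above:

```python
# Conventional degree-day calculation: average the daily high and low,
# then compare against a 65 °F balance point.
BALANCE_POINT_F = 65.0

def degree_days(high_f, low_f, balance=BALANCE_POINT_F):
    """Return (heating_dd, cooling_dd) from a day's high/low temperatures."""
    avg = (high_f + low_f) / 2
    heating = max(0.0, balance - avg)  # heating needed when avg is below 65 °F
    cooling = max(0.0, avg - balance)  # cooling needed when avg is above 65 °F
    return heating, cooling

# Example from the text: high of 76 °F, low of 60 °F -> average of 68 °F,
# i.e. 3 cooling degree days and no heating degree days.
print(degree_days(76, 60))  # (0.0, 3.0)

# The limitation: the same high/low pair can describe very different days.
# 23 hours at 76 °F and one hour at 60 °F still "averages" 68 °F by the
# high/low method, but the true hourly mean is much warmer.
hourly = [76.0] * 23 + [60.0]
hourly_mean = sum(hourly) / len(hourly)
print(round(hourly_mean, 1))  # 75.3
```

The gap between 68 and 75.3 degrees is exactly the information a high/low average throws away.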

“Moreover, choice of the accurate balance point temperature is highly contentious, and there is no consensus from the research community of how to best select it,” says Mukherjee.

Best predictor

To address these limitations, she and Nateghi studied more than a dozen weather measurements. They found that mean dew point temperature—the temperature at which air becomes saturated with water vapor—is the best predictor of increased energy demand. The next best predictor is a month's extreme maximum temperature, they say.
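Dew point is commonly estimated from air temperature and relative humidity. The sketch below uses the standard Magnus approximation for illustration; the paper does not specify which formulation the researchers used, so this is an assumption:

```python
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Estimate dew point (°C) from air temperature and relative humidity
    using the Magnus approximation."""
    a, b = 17.625, 243.04  # Magnus coefficients for temperature in °C
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# A 30 °C day at 70% relative humidity has a dew point near 24 °C.
print(round(dew_point_c(30.0, 70.0), 1))  # 23.9

# At 100% relative humidity, the air is saturated: dew point equals
# the air temperature.
print(round(dew_point_c(25.0, 100.0), 1))  # 25.0
```

Higher dew points mean more moisture in the air, which is part of why the measure tracks cooling demand more closely than dry-bulb temperature alone.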

The researchers combined these climate predictors with three other categories—the sector (residential, commercial, and industrial) consuming the energy, weather data, and socioeconomic data—to create their model.

They applied the model to the state of Ohio and found that the residential sector is most sensitive to climate variability. With a moderate rise in dew point temperature, electricity demand could increase by up to 20 percent. The prediction jumps to 40 percent with a severe rise.

By comparison, the Public Utility Commission of Ohio (PUCO), which doesn't consider climate change in its models, predicts residential demand increases of less than 4 percent through 2033.

It's similar in the commercial sector, where the researchers say demand could increase by up to 14 percent. Again, PUCO's projection is lower, at 3.2 percent. The industrial sector is less sensitive to temperature variability; however, the researchers say demand there could still exceed projections.

During the winter months, variations between the models are less significant. That is due, in part, to the relatively low percentage (22.6 percent) of Ohio residents who heat their homes with electricity.

While the study is limited to Ohio, the researchers say the model can be applied to other states. To communicate results, they used heat maps, which use color to provide an immediate visual summary of the data. The idea, they say, is to better inform decision makers with accurate, easy-to-understand information.
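A heat map of this kind can be produced with standard plotting tools. The sketch below is purely illustrative: the sector and scenario labels match the study, the 20/40 percent residential figures are the ones quoted above, but the commercial and industrial numbers are hypothetical placeholders, not results from the paper:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

sectors = ["Residential", "Commercial", "Industrial"]
scenarios = ["Moderate rise", "Severe rise"]

# Rows: sectors; columns: dew-point-rise scenarios.
demand_increase_pct = np.array([
    [20.0, 40.0],  # residential figures quoted in the article
    [9.0, 14.0],   # hypothetical commercial values (article gives only "up to 14")
    [3.0, 6.0],    # hypothetical industrial values
])

fig, ax = plt.subplots()
im = ax.imshow(demand_increase_pct, cmap="YlOrRd")
ax.set_xticks(range(len(scenarios)), labels=scenarios)
ax.set_yticks(range(len(sectors)), labels=sectors)
fig.colorbar(im, ax=ax, label="Projected demand increase (%)")
fig.savefig("demand_heatmap.png")
```

Each cell's color encodes the projected percentage increase, so a reader can spot the most climate-sensitive sector at a glance.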

The Purdue Climate Change Research Center and the National Science Foundation funded the work.

Source: University at Buffalo
