The world’s digital transformation has prompted dire warnings that the information and communications technology (ICT) industry’s appetite for energy would grow uncontrollably, like the classic horror-film creature The Blob.
“There have been many alarmist predictions of growing ICT energy use over the years, and all have proven to be bunk,” Eric Masanet, engineer at Northwestern University and co-author of an International Energy Agency (IEA) report on digitalization and energy, told Nature.
That isn’t to say that energy usage is not an ICT concern, but that the industry – much like the townsfolk who rallied to beat The Blob – has reined in energy consumption, especially in data centers, to help control operating costs, extend the life of existing facilities and equipment, protect the earth’s environment, and meet growing regulatory mandates.
“With the specter of an energy-hungry future looming, scientists in academic labs and engineers at some of the world’s wealthiest companies are exploring ways to keep the industry’s environmental impact in check,” said Masanet. “ICT’s energy use must be ‘vigilantly managed’ ... but if we stay on top of it, we should keep future energy demand in check.”
There is even hope that ICT can contribute to a more sustainable future.
“The ICT sector has been transforming from within and allows other sectors to better prepare to meet their green objectives,” writes UK B2B publication The Verdict. “Research also shows that the ICT industry is enabling many other sectors to reduce their carbon footprint. The World Economic Forum (WEF), for example, finds, by 2030, ICT technology will help reduce industrial emissions by 12.1 billion tons, nearly 10 times the amount emitted by the ICT industry.”
If there is actually a “Blob”, it's the world’s demand for more and more data, especially as we embrace everything from 5G to VR to IoT, all of which require more space for data storage and processing.
Consider these facts from the United States International Trade Commission:
And yet, Statista found that energy demand in data centers worldwide from 2015 to 2021 actually remained flat as traditional data centers gave way to cloud and hyperscale solutions.
“Traditional data centers globally have decreased their energy demand, from around 97.6 terawatt hours in 2015, to some 50 terawatt hours in 2019, and a forecast indicated that this figure will reach nearly 33 terawatt hours by 2021. On the other hand, hyperscale data centers have doubled their energy demand in the same period,” said Statista.
Overall data center usage in terawatt hours, according to Statista, went from 190.7 in 2015 to an estimated 190.81 last year.
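A quick back-of-the-envelope check makes the Statista figures concrete: the worldwide total barely moved even as traditional data centers shed roughly two-thirds of their demand, meaning cloud and hyperscale facilities absorbed an equivalent amount of growth. The sketch below uses only the numbers quoted above.

```python
# Back-of-the-envelope check of the Statista figures quoted above (all in TWh).
total_2015 = 190.7        # all data centers worldwide, 2015
total_2021 = 190.81       # all data centers worldwide, 2021 estimate
traditional_2015 = 97.6   # traditional data centers, 2015
traditional_2021 = 33.0   # traditional data centers, 2021 forecast

total_change = total_2021 - total_2015                  # ~0.11 TWh, essentially flat
traditional_drop = traditional_2015 - traditional_2021  # ~64.6 TWh decline
# Demand growth absorbed by cloud/hyperscale while the total stayed flat:
absorbed_growth = traditional_drop + total_change       # ~64.7 TWh

print(f"Total change: {total_change:.2f} TWh ({total_change / total_2015:.2%})")
print(f"Traditional drop: {traditional_drop:.1f} TWh")
print(f"Growth absorbed by cloud/hyperscale: {absorbed_growth:.2f} TWh")
```

In other words, roughly 65 TWh of annual demand shifted into cloud and hyperscale facilities with almost no change (about 0.06%) in the worldwide total.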
The ICT switch from traditional, often smaller and less efficient, data centers to larger hyperscale models has helped rein in energy consumption.
“Data center electricity demand has remained roughly level over the past half-decade, in part because of the “hyperscale shift” – the rise of super-efficient information factories that use an organized, uniform computing architecture that easily scales up to hundreds of thousands of servers,” says Nature.
The shift away from smaller data centers has helped control ICT’s “Zombie” problem: a 2015 study famously found that among 16,000 servers in corporate IT closets and basements, about one-quarter were “Zombies”, still plugged in and sucking up power but providing no useful output.
“These are servers sitting around doing nothing except using electricity, and that’s outrageous,” said report author Jonathan Koomey.
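The scale of that waste is easy to estimate from the study’s numbers. The per-server idle draw below is a hypothetical figure chosen purely for illustration; the study did not report one.

```python
# Rough scale of the "zombie" problem from the 2015 study cited above.
# IDLE_DRAW_W is a hypothetical illustrative figure, not from the study.
surveyed = 16_000        # servers examined in the 2015 study
zombie_fraction = 0.25   # about one-quarter were "zombies"
IDLE_DRAW_W = 150        # assumed idle power draw per zombie server, watts

zombies = int(surveyed * zombie_fraction)                  # 4,000 servers
wasted_mwh_per_year = zombies * IDLE_DRAW_W * 8760 / 1e6   # 8,760 hours/year

print(f"{zombies} zombie servers")
print(f"~{wasted_mwh_per_year:,.0f} MWh/year wasted at {IDLE_DRAW_W} W each")
```

Even under this conservative assumption, thousands of megawatt-hours per year go to machines doing no useful work.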
The continued shift towards cloud computing and moving storage and applications away from traditional data centers has tremendous potential to keep ICT energy usage in check.
ABB Reviews said that, on average, one hyperscale data center server can replace 3.75 conventional servers.
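That replacement ratio implies substantial consolidation. The sketch below applies it to a hypothetical fleet; the fleet size and the per-server power draw are illustrative assumptions, not figures from the article, and real savings depend heavily on the draw of the replacement hardware.

```python
import math

# Illustrative consolidation math using ABB's 1 : 3.75 replacement ratio.
# Fleet size and AVG_DRAW_W are hypothetical figures for illustration only.
REPLACEMENT_RATIO = 3.75   # conventional servers replaced per hyperscale server
AVG_DRAW_W = 300           # assumed average power draw per server, watts

def consolidated_count(conventional_servers: int) -> int:
    """Hyperscale servers needed to replace a conventional fleet."""
    return math.ceil(conventional_servers / REPLACEMENT_RATIO)

fleet = 1500                              # hypothetical conventional fleet
hyperscale = consolidated_count(fleet)    # 400 servers
saved_kw = (fleet - hyperscale) * AVG_DRAW_W / 1000

print(f"{fleet} conventional -> {hyperscale} hyperscale servers")
print(f"Rough power saved: {saved_kw:.0f} kW")
```

This deliberately oversimplifies by assuming equal per-server draw before and after consolidation; in practice a hyperscale server may draw more per unit, which is why the ratio, not the raw server count, is the meaningful figure.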
For medium to smaller-size data centers, every watt certainly counts, and the following trends are cutting energy consumption: