Organizations largely consider their hybrid data centers a success, even though nearly one-third experienced an IT downtime incident or service degradation in the past year, according to an Uptime Institute survey of more than 800 international data center operators and IT practitioners.
The majority of respondents, 80%, said their most recent outage was preventable, citing on-premises power failures, network failures and IT systems errors as the most common root causes. On average, organizations took one to four hours to fully recover.
Power usage effectiveness (PUE), the industry's preferred metric for data center efficiency, has fallen since 2007, dropping from about 2.5 to 1.58 in 2018, an all-time low for the metric. Even so, only 26% of respondents said they report data center energy and carbon use to a corporate sustainability team.
PUE has well-known limitations: the metric accounts for neither an IT department's efficiency nor network equipment. Companies that can generate and consume energy locally at their data centers can cut their PUE roughly in half.
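For readers unfamiliar with the metric, PUE is the ratio of total facility energy to the energy delivered to IT equipment, so 1.0 is the theoretical ideal. A minimal sketch of the calculation, using hypothetical figures chosen to match the survey's 1.58 average:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy.

    A value of 1.0 means every kilowatt-hour goes to IT equipment;
    everything above 1.0 is overhead (cooling, power distribution, lighting).
    """
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 1,580 kWh drawn overall to deliver 1,000 kWh to IT load.
print(round(pue(1580, 1000), 2))  # 1.58
```

This also makes the metric's blind spot concrete: an inefficient server counts as useful "IT equipment energy" in the denominator, so wasteful IT can actually improve the PUE figure.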
But some newer technology initiatives, including blockchain, Web 3.0 applications, big data and the internet of things, are driving data center energy consumption up, according to Domenic Alcaro, VP of data center software solutions at Schneider Electric. Companies unaware of the added compute power these technologies demand will see their costs rise.
A continuing problem with IT is that "when new tech comes out, there's really not a lot of thought given to the energy consumption of that tech," he said. Even when migrating entirely to the cloud is impractical, servers running at low capacity still consume significant energy and drive up costs. Companies using the hybrid approach in particular need to watch the energy consumption of underutilized servers.
Traditionally, data centers relied on batteries or generators for backup power, but the growth of hybrid solutions and full cloud adoption is changing that. Microsoft is experimenting with tidal energy, using the predictable pattern of currents to give its data centers more reliable power.
Google is keeping its data centers cool using artificial intelligence: a snapshot of the cooling system is taken every five minutes and sent to a deep neural network for analysis.
Today, "there's not a huge difference between what a server draws sitting idly versus what it draws at 90% utilization," Alcaro said. Companies that find the sweet spot between cloud use and on-premises server use can drive down their energy costs.
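Alcaro's point can be illustrated with a simple power model. The numbers below are hypothetical, chosen only to reflect the claim that idle draw is close to loaded draw; under that assumption, the energy cost per unit of work falls sharply as utilization rises, which is why a few busy servers beat many idle ones:

```python
# Hypothetical server power figures (illustrative only, not vendor data).
IDLE_WATTS = 200.0   # assumed draw at 0% utilization
PEAK_WATTS = 300.0   # assumed draw at 100% utilization

def watts_at(utilization: float) -> float:
    """Linear interpolation between idle and peak draw."""
    return IDLE_WATTS + (PEAK_WATTS - IDLE_WATTS) * utilization

# Power spent per unit of useful work at different utilization levels:
for u in (0.1, 0.5, 0.9):
    print(f"{u:.0%} utilization: {watts_at(u) / u:.0f} W per unit of work")
```

In this sketch a server at 10% utilization burns roughly six times more power per unit of work than one at 90%, which is the underutilization cost hybrid operators need to account for.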