Dive Brief:
- Google is using artificial intelligence to improve the efficiency of its data centers, cutting the energy used for cooling by 40% and improving Power Usage Effectiveness (PUE), the ratio of total building energy use to IT energy use, by 15%.
- The machine learning technology, used to optimize more than 100 variables in the data center, came from Google's $500 million acquisition of DeepMind in 2014.
- In a post to DeepMind's blog, a pair of researchers called the results a "phenomenal step forward" for data center efficiency and said it could help "greatly improve energy efficiency and reduce emissions overall."
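For reference, PUE is a simple ratio. A minimal sketch of the metric the story cites, with illustrative numbers (not Google's actual figures):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total building energy divided by the
    energy delivered to IT equipment. 1.0 is the theoretical ideal;
    everything above 1.0 is overhead (cooling, power distribution, etc.)."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative: a facility drawing 1,120 kWh to deliver 1,000 kWh to IT gear
pue(1120.0, 1000.0)  # → 1.12, i.e. 12% overhead beyond the IT load
```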
Dive Insight:
Google's use of machine learning to improve the energy efficiency of its data centers has a range of implications: in the near term, as demand for data processing grows, the advances can help data centers reduce costs, consume less energy and lower emissions.
DeepMind's researchers note that Google's data centers are already among the industry's most efficient, which makes the double-digit improvements all the more notable. "In any large scale energy-consuming environment, this would be a huge improvement. Given how sophisticated Google’s data centres are already, it’s a phenomenal step forward," DeepMind's Rich Evans and Google's Jim Gao wrote in a recent blog post.
Evans and Gao said the project took historical data on temperatures, power, pump speeds and more, and used it "to train an ensemble of deep neural networks."
"Since our objective was to improve data centre energy efficiency, we trained the neural networks on the average future PUE," they wrote. "We then trained two additional ensembles of deep neural networks to predict the future temperature and pressure of the data centre over the next hour. The purpose of these predictions is to simulate the recommended actions from the PUE model, to ensure that we do not go beyond any operating constraints."
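The workflow Evans and Gao describe can be sketched as a simple control loop: one model scores a candidate control setting by predicted PUE, while separate safety models predict temperature (and pressure) so that recommendations violating operating limits are rejected. In this hypothetical sketch the "models" are stand-in callables, not real neural network ensembles, and the names, limits and coefficients are all illustrative assumptions:

```python
TEMP_LIMIT_C = 27.0  # assumed safe-operating ceiling, illustrative only

def predict_pue(setting: dict) -> float:
    # Stand-in for the PUE ensemble: lower pump speed -> lower predicted PUE.
    return 1.10 + 0.002 * setting["pump_speed"]

def predict_temp(setting: dict) -> float:
    # Stand-in for the temperature ensemble: lower pump speed -> hotter hall.
    return 30.0 - 0.05 * setting["pump_speed"]

def recommend(candidates: list) -> dict:
    """Pick the candidate with the lowest predicted PUE among those the
    safety model says stay within the operating constraint."""
    safe = [c for c in candidates if predict_temp(c) <= TEMP_LIMIT_C]
    return min(safe, key=predict_pue) if safe else None

candidates = [{"pump_speed": s} for s in (40, 60, 80, 100)]
best = recommend(candidates)  # slowest pump speed that still keeps temps safe
```

The design point is the separation of concerns: the efficiency model never needs to encode safety rules, because candidate actions are screened against independently trained constraint predictors before being recommended.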
Google's isn't the only creative approach companies are taking to cool data centers and cut energy use, as energy demand from data centers is expected to double in the next five years.
Microsoft submerged a data center at the bottom of the Pacific Ocean earlier this year in hopes that the cool ocean water would keep it chilled.
Last year, a report by Future Resource Engineering focused on data center efficiency found more than 24 million kWh of potential savings in 40 locations. Those efficiencies equated to more than $3.5 million in cost savings and possible incentives, the firm said.
In June, the U.S. Department of Energy’s Advanced Research Projects Agency-Energy announced it was offering $25 million in funding for concepts focused on creating "innovative components to increase the energy efficiency of datacenters." DOE said it wanted to use new data-communications network designs and methods to double data center efficiency.
Cooling makes up a major portion of data centers' energy use, through pumps, chillers and cooling towers. But these complex environments make it difficult to run the cooling systems efficiently.
"The system cannot adapt quickly to internal or external changes (like the weather). This is because we cannot come up with rules and heuristics for every operating scenario," said Evans and Gao. "Each data centre has a unique architecture and environment. A custom-tuned model for one system may not be applicable to another. Therefore, a general intelligence framework is needed to understand the data centre’s interactions."