Google announced that it is already using DeepMind's AI to cut the amount of energy its data centers use for cooling by 40 percent. This marks one of the first real-world applications of the DeepMind technology since Google used it to power AlphaGo, the program that beat Lee Sedol, an 18-time world champion, at the ancient Chinese board game Go.
DeepMind, Beyond "Go"
So far, Google has mostly used its DeepMind artificial intelligence technology in gaming environments, where it could learn how to play the games by itself through trial and error, which is not unlike how humans learn various skills.
The games were effective in helping the AI develop human-like reasoning in varied environments, which is how the DeepMind AI managed to beat a world champion at a game that many experts thought AI would not master for at least another decade.
After the Go matches, Google started talking to the National Health Service (NHS) in England about how the DeepMind technology could be applied in real-world scenarios to improve healthcare. While we're still waiting on the results of that collaboration, DeepMind's AI has already scored a big win for Google itself by helping the company cut the energy its data centers use for cooling by 40 percent.
The DeepMind-Powered Data Center
Google has always focused on reducing the power consumption of its data centers. It has even invested significant amounts of money to power them with renewable energy and reduce the environmental impact its large data centers have on the climate. The company says its servers now deliver around 3.5 times the computing power they did five years ago while using the same amount of energy.
One of the primary uses for energy in a data center environment is cooling. The servers generate large amounts of heat, which the data center must remove for it to stay within safe temperature ranges. Data centers typically employ large industrial equipment such as pumps, chillers and cooling towers to regulate the temperature.
However, Google said that operating these cooling systems in a complicated data center environment, while taking into account external factors such as the weather, is a highly complex task. Human intuition and hand-tuned formulas for operating the systems often fall short of optimal results. In addition, each data center has its own architecture and environment, so operating knowledge gained at one site cannot simply be transferred to another.
Google started using machine learning two years ago to address this complex problem. However, it was only a few months ago that the company's data center engineers began collaborating with the DeepMind team to find a better solution.
The teams used data such as temperatures, power, pump speeds and setpoints from all of the existing data center sensors to train a set of deep neural networks. Google first trained the neural networks to predict the average future PUE (Power Usage Effectiveness), the ratio of total building energy usage to IT energy usage. It then trained two additional neural networks to predict the data center's temperature and pressure over the next hour. Using these predictions, the system can recommend a set of actions that keep energy use as low as possible.
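Google has not published the model code, but the setup it describes is a standard supervised-learning task: historical sensor readings in, predicted future PUE out. The following sketch, written in Python with scikit-learn and synthetic data, only illustrates the shape of that task; the feature names, value ranges and model size are assumptions, not details from Google.

# Minimal sketch (not Google's code): predict future PUE from data center
# sensor readings with a small feed-forward neural network.
# Features, value ranges and the synthetic target are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5000

# Hypothetical sensor snapshot: outside air temperature (C), IT load (kW),
# pump speed (%), chilled-water setpoint (C).
X = np.column_stack([
    rng.uniform(5, 35, n),
    rng.uniform(200, 800, n),
    rng.uniform(30, 100, n),
    rng.uniform(16, 24, n),
])

# Synthetic target: future PUE = total building energy / IT energy (>= 1).
cooling_kw = 10 * (0.05 * X[:, 0] + 0.02 * X[:, 2]) + rng.normal(0, 2, n)
pue = (X[:, 1] + np.clip(cooling_kw, 0, None)) / X[:, 1]

X_train, X_test, y_train, y_test = train_test_split(X, pue, random_state=0)

# Scale the features, then fit a small neural-network regressor.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 3))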
This DeepMind-powered solution led to a 40 percent reduction in the energy used for cooling, which translates to a 15 percent reduction in overall PUE overhead. The result is especially impressive because it produced the lowest PUE rating the site had ever recorded.
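To see how the two figures relate, plug the PUE definition above into a quick back-of-the-envelope calculation. The numbers below are purely hypothetical and chosen only so that the ratios line up with the reported percentages; Google has not disclosed the underlying absolute figures.

# Hypothetical arithmetic, not Google's actual numbers: how a 40 percent cut
# in cooling energy can show up as a 15 percent cut in PUE overhead.
it_energy = 100.0      # IT load, arbitrary units
cooling = 4.5          # cooling overhead before the AI's recommendations
other_overhead = 7.5   # electrical losses and other non-cooling overhead

pue_before = (it_energy + cooling + other_overhead) / it_energy        # 1.120
pue_after = (it_energy + 0.6 * cooling + other_overhead) / it_energy   # 1.102

overhead_cut = (pue_before - pue_after) / (pue_before - 1.0)
print(f"PUE: {pue_before:.3f} -> {pue_after:.3f}")
print(f"Reduction in PUE overhead: {overhead_cut:.0%}")                # 15%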
Google intends to apply the general-purpose framework it created for its data centers to other scenarios, such as improving power plant conversion efficiency, reducing the energy and water used in semiconductor manufacturing, and helping manufacturing facilities increase throughput. The company says it will share more details in an upcoming publication.