A data center is a facility that centralizes computer servers, storage, and networking systems to serve an organization's IT needs. As the world's economy shifts further into the digital era, the demand for data centers keeps accelerating. In 2006, data centers in the U.S. consumed about 61 billion kilowatt-hours (kWh), roughly 1.5% of total U.S. electricity consumption, and their energy use more than doubled between 2000 and 2006. If the energy efficiency of data centers did not improve, it was predicted that by 2011 their energy consumption would exceed 100 billion kWh. That would have a significant impact on the power grid and power generation, and would add millions of tons of greenhouse gas emissions. The need for more efficient data centers is urgent and calls for innovative solutions.
Fortunately, during the past decade the major data center operators worked to improve efficiency, and in 2014 data centers in the U.S. consumed an estimated 70 billion kWh, a growth rate much lower than originally forecast. Some of the successful strategies and best practices include:
- Consolidate smaller server operations into hyperscale data centers
- Adopt new energy-efficient servers and components
- Apply best cooling practices: liquid cooling, hot air containment, extensive monitoring, and free cooling during the cold season
As most data centers adopted these best practices, energy-efficiency improvements started to show diminishing returns, and energy use is expected to increase slightly in the near future, at a rate of about 4% from 2014 to 2020. However, according to the Uptime Institute's 2014 Data Center Survey, the global average Power Usage Effectiveness (PUE) of the largest data centers is around 1.7, which means that for every 1 kWh of energy used by IT equipment, 700 Wh is consumed as facility overhead.
Definition of Power Usage Effectiveness (PUE): total facility energy divided by the energy delivered to IT equipment
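The PUE figures above can be checked with a short calculation. This is a minimal sketch; the function names are illustrative, and the numbers are the ones quoted in the text.

```python
def pue(total_facility_kwh, it_kwh):
    """Power Usage Effectiveness: total facility energy / IT equipment energy."""
    return total_facility_kwh / it_kwh

def overhead_per_it_kwh(pue_value):
    """Facility overhead (kWh) incurred per 1 kWh of IT energy at a given PUE."""
    return pue_value - 1.0

# At the industry-average PUE of 1.7, every 1 kWh of IT load carries
# 0.7 kWh (700 Wh) of cooling and power-distribution overhead.
print(pue(1.7, 1.0))             # 1.7
print(overhead_per_it_kwh(1.7))  # ~0.7 kWh, i.e. 700 Wh
```

A perfectly efficient facility would have a PUE of exactly 1.0, with every watt going to the IT equipment itself.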
Google, the company with the largest data centers in the world, has long focused on improving energy efficiency while sustaining the explosive growth of the Internet. Since it started publicly reporting PUE in 2008, the average PUE across its whole fleet has continued to decrease and has stabilized around 1.12.
Historical PUE values at Google
As the modern data center becomes more complicated, with a wide variety of mechanical and electrical equipment, further reducing PUE has proven extremely difficult for the following reasons:
- Traditional formula-based engineering cannot capture the complex, nonlinear interactions among the various pieces of equipment and their settings.
- Although millions of data points are monitored every day by thousands of sensors installed in the data centers, analyzing such data is beyond simple intuition and heuristics.
- Rapid changes in internal demands (usage or traffic spikes) and external conditions (weather changes) make quick adaptation even more challenging for human operators.
- Because each data center has a different design and environment, a model that works for one data center does not transfer to another.
Hence, a general intelligence framework is needed to understand a data center's interactions and provide optimal management of its equipment. Building on the success of DeepMind's AI algorithms, Google trained a system of neural networks on different operating scenarios and parameters within its data centers. About 120 variables (such as fan and pump speeds, temperatures, and system efficiency) are fed into the neural networks, allowing the AI to learn the dynamics of the data centers and work out the most efficient methods of cooling.
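The core of this approach is a supervised model that predicts PUE from a snapshot of sensor readings; the cooling settings can then be searched for the lowest predicted PUE. The sketch below is a toy illustration of that idea, not Google's actual system: the data is synthetic, and the network size, training loop, and target values are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the ~120 monitored variables (fan speeds,
# pump speeds, temperatures, ...); real inputs come from facility sensors.
n_samples, n_features = 256, 120
X = rng.normal(size=(n_samples, n_features))
# Synthetic PUE target in a plausible 1.07-1.17 band, for illustration only.
y = 1.12 + 0.05 * np.tanh(X @ rng.normal(scale=0.1, size=n_features))

# One-hidden-layer network trained by plain gradient descent to predict
# PUE from the sensor snapshot; all sizes and rates are illustrative.
hidden, lr = 32, 0.01
W1 = rng.normal(scale=0.1, size=(n_features, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.1, size=hidden)
b2 = 0.0

for _ in range(500):
    h = np.tanh(X @ W1 + b1)   # hidden activations
    pred = h @ W2 + b2         # predicted PUE for each snapshot
    err = pred - y
    # Backpropagation for a mean-squared-error objective.
    gW2 = h.T @ err / n_samples
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)
    gW1 = X.T @ dh / n_samples
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training MSE on synthetic PUE data: {mse:.6f}")
```

Once such a model fits the plant's behavior well, operators (or an optimizer) can evaluate candidate settings against the model before applying them to the real facility.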
PUE comparison during a typical day of testing, with AI recommendations on and off
The results of the AI system are impressive: it reduced the amount of energy used for cooling by 40%, which equals a 15% reduction in overall PUE overhead.
The potential impact of such an intelligent system on data centers is enormous. Since most non-Google data centers have significantly higher PUE overhead, the room for improvement is even bigger. If adopting such a system reduced the average PUE from 1.7 to 1.6, a roughly 15% cut in overhead similar to the scenario above, about 4 billion kWh of energy could be saved each year, eliminating about 2.8 million tons of greenhouse gas emissions. If all non-Google data centers reached Google's PUE of 1.12, about 24 billion kWh of energy, or 16.8 million tons of greenhouse gas emissions, could be eliminated.
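These savings estimates can be reproduced with back-of-the-envelope arithmetic. The inputs are the figures quoted in the text; the per-kWh emissions factor is the ratio implied by the text's own numbers (4 billion kWh ≈ 2.8 million tons), not an authoritative value.

```python
total_2014_bkwh = 70.0  # estimated 2014 U.S. data center use, billion kWh
avg_pue = 1.7           # industry-average PUE
google_pue = 1.12       # Google's reported fleet-wide PUE

# IT load implied by the average PUE: total energy = IT energy * PUE.
it_energy = total_2014_bkwh / avg_pue

def total_at(pue):
    """Total facility energy (billion kWh) at a given PUE, same IT load."""
    return it_energy * pue

saved_at_1_6 = total_2014_bkwh - total_at(1.6)
saved_at_google = total_2014_bkwh - total_at(google_pue)

# Emissions factor implied by the text: ~0.7 million tons CO2e per billion kWh.
mt_per_bkwh = 0.7

print(f"PUE 1.7 -> 1.6:  save ~{saved_at_1_6:.1f} BkWh, "
      f"~{saved_at_1_6 * mt_per_bkwh:.1f} Mt CO2e")
print(f"PUE 1.7 -> 1.12: save ~{saved_at_google:.1f} BkWh, "
      f"~{saved_at_google * mt_per_bkwh:.1f} Mt CO2e")
```

Running this yields roughly 4.1 billion kWh saved in the first scenario and about 23.9 billion kWh in the second, matching the figures in the text to rounding.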
While many companies have made big strides in increasing data center efficiency, there are still significant opportunities for further improvement. Google's recent use of advanced technology like DeepMind's to optimize data center operations showed impressive results in reducing energy consumption, and it demonstrates the great potential of AI to help humans optimize complex systems and address the urgent challenge of climate change.
References:
- United States Data Center Energy Usage Report, Ernest Orlando Lawrence Berkeley National Laboratory
- Based on the estimated increase of 30 billion kWh and the Greenhouse Gas Equivalencies Calculator
- 2014 Data Center Industry Survey, Uptime Institute
- Efficiency: How we do it, Google
- Machine Learning Applications for Data Center Optimization, Jim Gao, Google
- DeepMind AI Reduces Google Data Centre Cooling Bill by 40%, Google