U.S. Data Centers Could Consume 140Bn Kilowatt-Hours of Electricity by 2020
FREMONT, CA: According to a recent report from the Natural Resources Defense Council (NRDC), the electricity consumption of U.S. data centers is expected to reach 140Bn kilowatt-hours by 2020, up from the 91Bn kilowatt-hours consumed in 2013.
The projected 140Bn kilowatt-hours of electricity consumption by 2020 is equivalent to the annual output of 50 power plants and would cost $13Bn annually in electricity bills. The troubling aspect of this massive power consumption is that it may emit nearly 100Mn metric tons of carbon pollution every year.
The inefficiency in data center electricity consumption stems largely from small, medium, and corporate data centers, while the larger server farms managed by top internet brands already have adequate methodologies in place for efficient electricity use. As Pierre Delforge reports, the bulk of power consumption occurs not in the hyper-scale cloud computing companies providing internet services to consumers and businesses, but in small, medium, and large corporate data centers as well as in multi-tenant data centers.
Lack of metrics and transparency, along with misaligned incentives, are the barriers preventing these less energy-efficient data centers from being on par with hyper-scale data centers.
Best practices to increase data center energy efficiency include:
Adoption of a simple server utilization metric – to identify under-utilization of CPUs.
Alignment of incentives between decision-makers – best practices should reward stakeholders for energy savings rather than stand in the way of them.
Disclosure of data center energy and carbon performance – public disclosure is a tool for demonstrating leadership and driving behavior change across an entire sector.
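The first of these practices can be illustrated with a minimal sketch. This is not a metric defined in the NRDC report; the function names, the sample data, and the 10 percent threshold are all hypothetical choices for illustration, assuming CPU utilization is sampled periodically as a percentage per server.

```python
# Minimal sketch of a simple server utilization metric (illustrative only,
# not from the NRDC report): average sampled CPU utilization per server,
# flagging servers below a hypothetical 10% threshold as under-utilized.

def average_utilization(cpu_samples):
    """Return mean CPU utilization (0-100) over a series of samples."""
    if not cpu_samples:
        raise ValueError("no utilization samples provided")
    return sum(cpu_samples) / len(cpu_samples)

def underutilized_servers(fleet, threshold=10.0):
    """Given a mapping of server name -> utilization samples,
    return the names of servers averaging below the threshold."""
    return [name for name, samples in fleet.items()
            if average_utilization(samples) < threshold]

# Hypothetical fleet data: four CPU-utilization samples per server.
fleet = {
    "web-01": [55, 60, 48, 52],
    "db-02": [70, 65, 72, 68],
    "legacy-03": [3, 5, 4, 2],  # candidate for consolidation
}
print(underutilized_servers(fleet))  # → ['legacy-03']
```

Even a crude metric like this makes under-utilization visible, which is the point of the recommendation: servers idling at single-digit utilization draw power while doing little useful work.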