What Trends Is the Data Center Industry Heading Towards in 2016?
FREMONT, CA: The data center industry continues to advance at an unprecedented pace. While many businesses remain cautious about making significant changes to their IT infrastructure due to economic uncertainty, there are prominent signs of innovation on many fronts and the potential for major transformation in the data center industry. Keeping this in mind, Henrique Cecci, Research Director at Gartner, has outlined a few emerging trends that will affect data center facilities in the coming year, reports Robert Gates for TechTarget.
The data center trends to look out for in 2016 will be:
Rise in Power Requirements
With data centers using more kilowatts (kW) per rack and per square foot than ever before, the demand for energy is going to rise. To support the increasing power consumption, data centers will be opting for new configurations and designs.
With the movement towards Big Data and data analytics, along with newer technologies such as converged infrastructure and integrated systems, data centers are starting to run out of power. As convergence becomes a growing trend, there will be a need for converged infrastructure management platforms that provide a unified view bridging the gap between virtual networks and physical infrastructure. In this respect, virtualization will become a driving force for hyperconvergence, as it exposes the inefficiencies of SAN storage and the need to virtualize the storage and network layers.
According to Tim Flynn, Michael Giess, Steve Harris, Thomas McKinney and Kevin Vesely of Forsythe Solutions Group, most cabinets in low-density data centers are built to handle only 4 to 5 kW of critical power. With today's converged infrastructures, a single IT device can easily consume that entire budget, leaving the majority of the cabinet empty. As a result, additional IT equipment cannot be accommodated, which in turn causes a huge waste of financial resources.
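The capacity gap can be illustrated with some simple arithmetic. In the sketch below, the 4–5 kW cabinet budget comes from the figures above; the specific device draws are hypothetical example values, not measurements from the article:

```python
# Illustrative rack power-budget check. The 5 kW cabinet figure reflects
# the low-density range cited above; the device draws are hypothetical.
def remaining_rack_budget(cabinet_kw, device_draws_kw):
    """Return the critical power (kW) left in a cabinet after devices."""
    return cabinet_kw - sum(device_draws_kw)

cabinet_kw = 5.0                 # typical low-density cabinet budget
converged_system_kw = [5.0]      # one converged chassis can use it all
print(remaining_rack_budget(cabinet_kw, converged_system_kw))  # 0.0
```

With a single converged system drawing the full budget, nothing remains for the rest of the cabinet's rack units, which is the stranded-capacity problem the authors describe.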
Need for Greater Security
The advancement of IoT, cloud computing and software-defined infrastructure is raising concerns about security, says Cecci. Both industry and government are undertaking initiatives to raise the regulatory bar and increase scrutiny, especially for banking and finance businesses.
The move towards multi-tenant and hosted data center facilities is creating a need for greater security, as sharing physical and network infrastructure with other tenants might put data at risk. Moving to a colocation data center with strong physical security can not only prevent unauthorized access but also keep data in compliance with industry regulations. To strengthen their physical, building infrastructure and logical security, colocation data centers are adding more cameras and physical security checkpoints.
As organizations do not want to deal with outdated facilities and building systems infrastructure, they are turning more to the cloud and to data center colocation providers. Colocation facilities make it possible for enterprises to adopt a hybrid data center model that runs legacy, private cloud and public cloud infrastructures.
Greater Impact of IoT
The Internet of Things (IoT) has a potential transformational effect on the data center market, its customers, technology providers, technologies, and sales and marketing models, according to Gartner. It estimates that the IoT will include 26 billion installed units by 2020, and such deployments will generate large quantities of data that need to be processed and analyzed in real time, increasing the workload of data centers. IoT will also increasingly become part of operations within the data center itself, for tasks such as asset management using sensors to monitor temperature and physical security.
Data Center Infrastructure Management (DCIM)
As a result of increased data storage pressure, data center operators and providers will need to deploy more forward-looking capacity management platforms. These can include a data center infrastructure management (DCIM) approach that aligns IT and operational technology (OT) standards and communications protocols, so that IoT data points are processed according to priorities and business needs.
Open Standards for Data Center
By establishing standards that are open and vendor-agnostic, systems and software manufacturers will have a clearer roadmap to develop open data center products with more predictable features and functionality.
The Open Compute Project (OCP) was founded by Facebook to share the lessons learned from building its own data centers. It develops and shares designs for compute, storage and general data center infrastructure, covering servers, chassis and racks as well as their power and cooling. It seeks to offer energy-efficient, low-cost open-source solutions for both hardware and software. In the coming years, OCP adoption by enterprise data centers will grow as a way to reduce operational cost. Open standards for networking and storage will also change the framework of data center facilities in the future, says Cecci.
Increased Use of Metrics
Gartner states that to improve the efficiency of data centers, enterprises cannot depend on the power usage effectiveness (PUE) metric alone. They need better metrics that go beyond measuring electricity, extending to water use and how much heat is redirected from the hot aisle to office areas.
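For context, PUE is the ratio of a facility's total energy consumption to the energy delivered to IT equipment, so a value of 1.0 would mean every kilowatt-hour goes to compute. A minimal sketch of the calculation, with hypothetical energy figures:

```python
# Power usage effectiveness (PUE): total facility energy divided by the
# energy delivered to IT equipment. 1.0 is the ideal; real facilities
# are higher because of cooling, lighting and power-distribution losses.
# The kWh figures below are hypothetical examples, not measured data.
def pue(total_facility_kwh, it_equipment_kwh):
    """Return the PUE ratio for a given reporting period."""
    return total_facility_kwh / it_equipment_kwh

print(pue(total_facility_kwh=1800.0, it_equipment_kwh=1200.0))  # 1.5
```

The limitation Gartner points to is visible in the formula: PUE captures only electricity, so two facilities with the same ratio can differ widely in water consumption or heat reuse.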
More Investment in Renewable Energy Projects
Environmental concerns have prompted even big names like Facebook and Google to run their data centers on renewable energy. Today, most hyperscale data centers are committed to green initiatives, making more investments in renewable energy projects in order to meet compliance requirements.
For instance, Facebook is going to set up a new data center in Fort Worth, Texas, that will be powered entirely by renewable energy. For sustainability, the Fort Worth data center will be cooled using outdoor air rather than energy-intensive air conditioners, reports Katherine Noyes for Computerworld.
Opportunities for Liquid Cooling
Direct liquid cooling was originally developed by IBM to cool mainframes, but its use in data centers took a back seat with the arrival of complementary metal-oxide semiconductor (CMOS) chips. As the number of data centers with cooling requirements close to those of high-performance computing grows, liquid cooling is on the verge of making a comeback, according to a recent report by 451 Research.
Companies like Facebook and Google have "legitimized a less conservative approach to the design and operation of facilities and paved the way for use of technologies such as [direct liquid cooling] by other operators," says Andrew Donoghue, European research manager at 451 Research and the report's author.