8 Data Center Trends to Watch Out For in 2016
A data center is a centralized repository, physical or virtual, used to store, manage, and disseminate the data around which a particular business is organized. Contemporary enterprises conduct much of their business over the internet, and data centers provide the backbone that helps those businesses run efficiently. Most, if not all, of the information flowing through a company over an electronic medium passes through some sort of data center, whether on-premise or in a separate specialized facility. As such, CEOs, CIOs, and CTOs everywhere are working to better equip their data centers to absorb the heavy upturn in online activity.
The confluence of disruptive technologies like cloud computing, virtualization, mobile computing, and big data in data center management has led to a reinvention of data center strategies in recent years. Upper management teams of present-day enterprises now have to devise enhanced infrastructure plans that are better equipped to deal with the ubiquitous adoption of IoT across channels. They have to weigh important decisions regarding increased density and its impact on cooling, power, and space, as well as adequate security measures.
Businesses have to adapt to the increase in computing power required to run daily operations seamlessly. At the same time, enterprises are expected to be highly resource-efficient while cutting costs. In response to these challenges, we envisage the following trends emerging in 2016:
Reconfiguring Power Requirements
Electricity is often the unsung hero of the computing realm, but a server without electricity is effectively just a heap of metal parts. Data centers don’t just power the servers; the rest of the facility makes hefty electrical demands of its own, viz. lighting, cooling, UPS, fire suppression, and alarm systems. Data centers traditionally budgeted 4 to 5 kW per rack, but that has been amped up to 8 to 12 kW per rack or more today. Modern data centers draw more kilowatts (kW) per rack, and per square foot, than ever before, owing to new technologies like hyper-converged infrastructure, microservices, and containers that raise energy requirements.
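To make the density shift concrete, a minimal sketch of the arithmetic: for a fixed IT power budget, raising per-rack density sharply reduces how many racks that budget can feed. The 1 MW budget and the 5 kW / 12 kW figures below are illustrative assumptions drawn from the ranges mentioned above.

```python
# Hypothetical illustration: how many racks a fixed IT power budget supports
# at traditional (~5 kW/rack) versus modern (~12 kW/rack) densities.
# All figures are assumptions for the sake of example.

def racks_supported(total_it_power_kw: float, kw_per_rack: float) -> int:
    """Number of fully powered racks a fixed IT power budget can support."""
    return int(total_it_power_kw // kw_per_rack)

budget_kw = 1000  # assumed 1 MW of IT power capacity

legacy = racks_supported(budget_kw, 5)    # traditional density
modern = racks_supported(budget_kw, 12)   # high density

print(legacy, modern)  # → 200 83
```

The same budget powers far fewer racks at high density, which is why cooling, space, and power planning must be reworked together rather than rack count alone.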
New temperature and humidity guidelines have shaped new configurations and designs, culminating in more data centers built with a modular approach that averts oversizing. Increased density and greater power in each rack is a growing trend that allows businesses to get the most out of their servers.
Security and Compliance
Big businesses and government regulators alike have grown more stringent, and scrutiny is increasing, especially in the banking and finance sectors. Compliance with Statement on Standards for Attestation Engagements (SSAE) No. 16, the ANSI/TIA-942 data center infrastructure standard, Sarbanes-Oxley, HIPAA (Health Information Security Rule Safeguard Standards), and the PCI-DSS (Payment Card Industry Data Security Standard) are therefore important criteria to have on your checklist.
Protecting the data in storage subsystems and network traffic from malware, hacking, and data leakage is a high priority. At the same time, layered physical security on-site, with restricted personnel entry, is an often overlooked aspect of data center security, and neglecting it can be detrimental to the organization.
Smart Technologies, Open Standards, IoT and Hyperconvergence
A recent study by Gartner projected that by 2020, over 25 billion devices will be connected to the web, which will push the demand for storage within data centers immensely. By incorporating smart technologies powered by IoT into the data center, facility managers can monitor the status of components and environmental measurements in real time to keep operations flowing efficiently. Hyperconverged systems, which bring together key IT system components from various vendors into one central system monitored through a software layer, will take on a larger role within data center infrastructures.
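The real-time monitoring described above typically boils down to comparing sensor readings against operating thresholds and alerting on excursions. The sketch below is a hypothetical illustration, not a real DCIM API; the sensor names are invented, and the temperature band loosely follows the widely used ASHRAE-style recommended inlet range of roughly 18–27 °C.

```python
# Hypothetical sketch of threshold-based monitoring over IoT sensor readings.
# Sensor names and limits are illustrative assumptions, not a vendor API.

THRESHOLDS = {
    "inlet_temp_c": (18.0, 27.0),   # assumed recommended inlet range
    "humidity_pct": (20.0, 80.0),   # assumed acceptable relative humidity
}

def check_reading(sensor: str, value: float) -> str:
    """Classify a single reading against its configured band."""
    low, high = THRESHOLDS[sensor]
    if value < low:
        return f"{sensor}: {value} below {low} - alert"
    if value > high:
        return f"{sensor}: {value} above {high} - alert"
    return f"{sensor}: {value} ok"

print(check_reading("inlet_temp_c", 29.5))  # → inlet_temp_c: 29.5 above 27.0 - alert
```

A production system would stream such checks continuously and feed alerts into the facility team's dashboards, but the core logic is this simple comparison.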
An innovative technology like the Open Compute Project (OCP), started by Facebook, has found itself deployed in hyperscale data centers in order to reduce operational costs. Even if more than one open standard emerges in the future, open standards for networking and storage are a growing trend in the sector.
Comprehensive Metrics Beyond PUE
Comprehensive metrics are required to improve the efficiency of the data center. Most businesses today focus only on simple metrics such as Power Usage Effectiveness (PUE), the ratio of total facility power to IT equipment power, but PUE alone is not enough. Facility management teams need to monitor much more than electricity, keeping track of water usage and the amount of heat being directed from the hot aisle to the workspace.
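PUE is defined as total facility power divided by the power delivered to IT equipment, so a value of 1.0 would mean every watt entering the facility reaches the servers. A minimal sketch, with the kW readings below assumed for illustration:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.
    1.0 is the theoretical ideal; real facilities run above it because of
    cooling, lighting, UPS losses, and other overheads."""
    return total_facility_kw / it_equipment_kw

# Assumed readings: 1,500 kW drawn by the facility, 1,000 kW reaching IT gear.
print(round(pue(1500, 1000), 2))  # → 1.5
```

The 500 kW gap in this example is exactly the overhead (cooling, lighting, power conversion) that PUE captures, and equally exactly what PUE says nothing about: water consumption and heat reuse need their own metrics.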
Going Green
Data centers in the coming years will face growing pressure to find workable ways to integrate renewable energy sources into projects. The growth of corporate social responsibility programs focused on carbon neutrality, coupled with the potential for U.S. federal legislation placing caps on carbon emissions, is currently driving the “go green” initiative across data centers in the country.
Solar-powered solutions deployed by companies like Emerson Network Power, AISO, and i/o Data Centers show how serious the industry is about protecting the environment. Wind-powered and geothermal data centers are also increasing in number, although their successful implementation depends to an extent on geography. Another innovative way data centers are living up to the ‘environmentally friendly’ mantle is a process called waste heat reclamation: most data centers already generate ample heat from the back of their server racks, and this process turns that heat into an energy source.
Building-Block Scalability
Big players like Google, Facebook, and Amazon have already invested heavily in architecting highly responsive IT environments that can easily expand and shrink as business requirements dictate. Building-block scalability remains a trend for the future, as software-defined approaches now enable building-block scaling with equivalent levels of redundancy and resiliency. Physical facilities, too, are now being designed for higher-density environments, with special accommodations to support a high-density equipment load.
Liquid Cooling to Play a Greater Role
High-performance computing is set to be affected pervasively: with new liquid cooling technologies, one can place over 250 kW in a single rack. Liquid immersion cooling is thus set to play an important role for such systems.
Liquid cooling can be deployed at specific areas, targeted down to the row and rack, making it highly flexible. It is also quieter and more reliable than traditional methods, so its implementation across data centers is set to grow in the near future.
DCIM and Autonomous Systems
Data Center Infrastructure Management (DCIM), when acquired from the right vendor, can optimize the performance, efficiency, and business value of the physical IT infrastructure and keep it effortlessly aligned with business needs. The physical layer is now being treated with the same priority as the logical layer, and significant investments are shifting from monitoring the logical layer to managing the physical layer. DCIM tools are predicted to gain a new level of intelligence and refined automation, as automating data center management activities will become mandatory to reduce workloads and human error in the near future.