Data Centers Leveraging AI for Efficiency

By CIOReview | Friday, June 15, 2018

AI is set to play a key role in data-center operations as enterprises adopt machine-learning technologies that have already been implemented and tested by much larger data-center operators. Hybrid computing environments now stretch across on-premises data centers, edge computing deployments, colocation sites, and the cloud. AI holds tremendous potential for streamlining the management of these complicated computing facilities.

With machine learning, AI in the data center can track and automate the management of facility components such as power and power-distribution elements, rack systems, and physical security. Data-center facilities contain many sensors that gather data from devices such as UPS units, chillers, and switchgear. Machine-learning algorithms analyze the data from these devices and their environment, providing insights into performance and determining appropriate responses, such as sending an alert or changing a setting. The system also learns from these changes and is trained to self-adjust rather than relying on fixed programming instructions. The main aim is to help data-center operators increase the efficiency and reliability of their facilities and make them run more autonomously.
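The monitor-analyze-respond loop described above can be sketched in a few lines. This is a simplified statistical stand-in for the machine-learning analysis the article refers to: it learns a "normal" baseline from recent sensor readings and flags sharp deviations. All names here (`send_alert`, `adjust_setpoint`, the threshold values, the chiller data) are hypothetical illustrations, not part of any real DCIM product.

```python
# Illustrative sketch: a rolling z-score check over sensor readings, standing in
# for the ML analysis described above. Hypothetical names throughout.
from collections import deque
from statistics import mean, stdev

WINDOW = 20        # number of recent readings that define the baseline
THRESHOLD = 3.0    # z-score beyond which a reading is treated as anomalous

def send_alert(sensor, value):
    # Placeholder for the alerting action the article mentions
    print(f"ALERT: {sensor} reading {value} deviates from baseline")

def adjust_setpoint(sensor, value):
    # Placeholder for an automated setting change
    print(f"ADJUST: reviewing {sensor} setpoint after reading {value}")

def monitor(sensor, readings):
    """Flag readings that deviate sharply from the recent baseline."""
    window = deque(maxlen=WINDOW)
    anomalies = []
    for value in readings:
        if len(window) >= 2:
            mu, sigma = mean(window), stdev(window)
            if sigma > 0 and abs(value - mu) / sigma > THRESHOLD:
                anomalies.append(value)
                send_alert(sensor, value)
                adjust_setpoint(sensor, value)
        window.append(value)
    return anomalies

# Simulated chiller supply-temperature readings (°C) with one spike
temps = [18.0, 18.2, 17.9, 18.1, 18.0, 18.3, 17.8, 18.1, 25.0, 18.0]
print(monitor("chiller-1", temps))  # → [25.0]
```

A production system would replace the z-score with a trained model and feed it telemetry pulled from UPS units, chillers, and switchgear over the facility's management protocols, but the control flow (observe, score, respond) is the same.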

It was only within the last decade that data centers were first fully instrumented with meters to monitor power and cooling. Data centers depend on building management systems that use multiple communication protocols, such as Modbus, Niagara, and LonWorks, and often have to make do with devices that do not share data or cannot be controlled remotely. But data-center monitoring is advancing rapidly to meet the requirements of advanced analytics and machine learning.