Hitachi Data Systems Unveils its Next-Generation Hyper-Converged Platform

By CIOReview | Tuesday, March 22, 2016

SANTA CLARA, CA: Hitachi Data Systems Corporation (HDS) has announced the release of its next-generation Hitachi Hyper Scale-Out Platform (HSP). The HSP offers native integration with the Pentaho Enterprise Platform to deliver a sophisticated, software-defined, hyper-converged platform for big data deployments. The HSP 400 series combines compute, storage and virtualization capabilities, providing a seamless infrastructure to support big data blending, embedded business analytics and simplified data management.

HSP, the latest offering from HDS, delivers a software-defined architecture that centralizes and simplifies the storing and processing of large datasets, with high availability, simplified management and a pay-as-you-grow model. The fully configured turnkey appliance can be installed within hours, supports production workloads and simplifies the creation of an elastic data lake, ultimately helping customers integrate disparate datasets and run advanced analytic workloads with ease.

"Many enterprises don't possess the internal expertise to perform big data analytics at scale with complex data sources in production environments. Most want to avoid the pitfalls of experimentation with still-nascent technologies, seeking a clear path to deriving real value from their data without the risk and complexity," said Nik Rouda, Senior Analyst at Enterprise Strategy Group (ESG). "Enterprise customers stand to benefit from turnkey systems like the Hitachi Hyper Scale-Out Platform, which address primary adoption barriers to big data deployments by delivering faster time to insight and value, accelerating the path to digital transformation."

HSP's scale-out architecture provides simplified, scalable and enterprise-ready infrastructure for big data. The next-generation HSP system also offers native integration with the Pentaho Enterprise Platform, giving customers complete control of the analytic data pipeline along with enterprise-grade features such as big data lineage, lifecycle management and enhanced information security.

With HSP, Hitachi continues to deliver on the promise of the software-defined datacenter to simplify the delivery of IT services through greater abstraction of infrastructure, and improved data access and automation.

"We consistently hear from our enterprise customers that data silos and complexity are major pain points—and this only gets worse in their scale-out and big data deployments. We have solved these problems for our customers for years, but we are now applying that expertise in a new architecture with Hitachi Hyper Scale-Out Platform," said Sean Moser, senior vice president, global portfolio and product management at Hitachi Data Systems. "Our HSP appliance gives them a cloud and IoT-ready infrastructure for big data deployments, and a pay-as-you-go model that scales with business growth. Seamless integration with the Pentaho Platform will help them put their IT and OT data to work—faster. This is only the first of many synergistic solutions you can expect to see from Hitachi and Pentaho. Together, we are making it easy for our enterprise customers to maximize the value of their IT and OT investments and accelerate their path to digital transformation."