Hitachi Data Systems Corporation (HDS) has released the next-generation Hitachi Hyper Scale-Out Platform 400 (HSP), which now offers native integration with the Pentaho Enterprise Platform.
HDS said the scale-out architecture provides enterprise-ready infrastructure to support big data blending, embedded business analytics and simplified data management.
The architecture includes a centralised user interface to automate the deployment and management of virtualised environments for open source big data frameworks, including Apache Hadoop and Apache Spark, as well as commercial open source stacks.
Hitachi Data Systems senior vice president of global portfolio and product management, Sean Moser, said the company consistently hears from enterprise customers that data silos and complexity are major pain points that grow worse in their scale-out and big data deployments.
“Our HSP appliance gives them a cloud and Internet of Things (IoT) infrastructure for big data deployments and a pay-as-you-go model that scales with business growth. Integration with the Pentaho Platform will help them put their IoT data to work, faster.”
Pentaho chief technology officer, James Dixon, said, “The HSP-Pentaho appliance gives customers an affordable, enterprise-class option to unify all their disparate datasets and workloads via a modern, scalable, hyper-converged platform that eliminates complexity. The Hitachi Hyper Scale-Out Platform 400 is a great first step in simplifying the entire analytic process.”