Taking proper advantage of Big Data requires a holistic approach in which organisations view their storage as a ‘data platform’ rather than a ‘data destination’, according to open source solutions provider Red Hat.
As such, Red Hat said enterprise storage solutions must: deliver cost-effective scale and capacity; eliminate data migration by being able to grow without bound; bridge legacy storage silos; provide global access to data; and protect data while keeping it available.
To lower cost, organisations must take a ‘scale-out’ approach to their Big Data storage platform. Red Hat said this not only offers low costs today, but also the ability to benefit from increased buying power as hardware gets better, faster, and cheaper over time. The company added that a Big Data storage system must also be scalable in the performance dimension, so that applications experience no degradation as the volume of data increases.
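One common way scale-out storage systems place data across an expandable pool of commodity nodes is consistent hashing. The sketch below is illustrative only (it is not Red Hat's implementation; the class and node names are hypothetical): adding a node grows capacity while moving only the keys the new node takes over, rather than reshuffling everything.

```python
import hashlib
from bisect import bisect

class ConsistentHashRing:
    """Toy consistent-hash ring: each node owns many points ('virtual
    nodes') on a ring, and a key maps to the next point clockwise."""

    def __init__(self, nodes=(), vnodes=100):
        self.vnodes = vnodes
        self._ring = []  # sorted list of (hash, node)
        for n in nodes:
            self.add_node(n)

    def _hash(self, key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def add_node(self, node):
        # Scale out: new node claims points on the ring; only keys whose
        # successor point now belongs to it change ownership.
        for i in range(self.vnodes):
            self._ring.append((self._hash(f"{node}#{i}"), node))
        self._ring.sort()

    def node_for(self, key):
        h = self._hash(key)
        idx = bisect(self._ring, (h,)) % len(self._ring)
        return self._ring[idx][1]

ring = ConsistentHashRing(["node-a", "node-b", "node-c"])
keys = [f"object-{i}" for i in range(1000)]
before = {k: ring.node_for(k) for k in keys}
ring.add_node("node-d")           # grow the cluster by one node
after = {k: ring.node_for(k) for k in keys}
moved = sum(1 for k in keys if before[k] != after[k])
# Roughly a quarter of the keys migrate to node-d; the rest stay put.
```

Every key that changes placement lands on the new node, which is what lets such systems add capacity incrementally instead of via wholesale migration.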
As enterprise data stores move towards petabyte sizes, wholesale data migration is no longer logistically or financially feasible, according to Red Hat. The need for periodic migration must therefore be eliminated by providing a system that can grow without bound.
In addition, by bridging legacy storage silos rather than adding yet another storage solution to the mix, organisations can access and use all of their data without ad hoc intervention.
Red Hat added that a centralised approach to data management is not possible in the age of Big Data as data volumes are too large, wide area network (WAN) bandwidth is too limited, and the consequences of a single point of failure are too costly. Instead, a Big Data storage platform must be able to manage data that is distributed across the global enterprise as a single, unified pool.
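The "single, unified pool" idea can be pictured as a namespace layer that routes requests to whichever site holds the data. This is a hypothetical sketch, not Red Hat's actual product or API; the class, paths, and site names are invented for illustration:

```python
class UnifiedNamespace:
    """Toy federation layer: several regional stores are mounted behind
    one logical namespace, so applications address a single pool even
    though the data is distributed across sites."""

    def __init__(self):
        self.sites = {}  # path prefix -> site-local key/value store

    def mount(self, prefix, site_store):
        self.sites[prefix] = site_store

    def read(self, path):
        # Route to the site owning the longest matching prefix.
        match = max((p for p in self.sites if path.startswith(p)),
                    key=len, default=None)
        if match is None:
            raise FileNotFoundError(path)
        return self.sites[match][path]

ns = UnifiedNamespace()
ns.mount("/emea/", {"/emea/sales/q1.csv": b"emea-data"})
ns.mount("/apac/", {"/apac/sales/q1.csv": b"apac-data"})
# One namespace, two sites: the caller never names a location,
# and no central copy of the data is required.
```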
Finally, the platform must assume that hardware failure is inevitable and offer data availability and integrity through intelligent software.
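Tolerating inevitable hardware failure in software usually means replication: each object is written to several nodes so any surviving copy can serve a read. The sketch below is a minimal illustration under that assumption (the class and its API are hypothetical, not Red Hat's implementation):

```python
class ReplicatedStore:
    """Toy replicated object store: every put lands on `replicas`
    distinct nodes, so reads survive node failures in software."""

    def __init__(self, node_names, replicas=3):
        self.nodes = {name: {} for name in node_names}  # node -> local store
        self.failed = set()
        self.replicas = replicas

    def put(self, key, value):
        # Pick `replicas` distinct nodes for this key, deterministically.
        names = sorted(self.nodes)
        start = hash(key) % len(names)
        for i in range(self.replicas):
            self.nodes[names[(start + i) % len(names)]][key] = value

    def get(self, key):
        # Any surviving replica can answer the read.
        for name, store in self.nodes.items():
            if name not in self.failed and key in store:
                return store[key]
        raise KeyError(key)

store = ReplicatedStore(["n1", "n2", "n3", "n4"], replicas=3)
store.put("invoice-42", b"...")
store.failed.update({"n1", "n2"})  # two simultaneous node failures
value = store.get("invoice-42")    # still readable from a third replica
```

With three replicas spread over four nodes, any two failures still leave at least one live copy, which is the sense in which intelligent software, rather than reliable hardware, provides availability.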