Symantec Partners with Hortonworks to Launch Enterprise Solution for Hadoop


Symantec Corp. has partnered with Hortonworks to introduce the new Symantec Enterprise Solution for Hadoop, a scalable data management offering for big data workloads. The add-on to Symantec's Cluster File System enables customers to run big data analytics on their existing storage infrastructure.

With the new release, Symantec's value proposition is "Don't go out and build out a new analytics cluster. Leverage all the data and infrastructure that you have today and in many ways, make Hadoop more enterprise-ready," Don Angspatt, vice president of product management, Storage and Availability Management Group, Symantec Corp., tells 5 Minute Briefing. By integrating existing storage assets into the Hadoop processing framework, Symantec says, organizations can avoid time-consuming and costly data movement. Symantec Enterprise Solution for Hadoop lets administrators leave data where it resides and run analytics on it without having to extract, transform and load it into a separate cluster, avoiding expensive data migrations.

With Symantec Enterprise Solution for Hadoop, organizations can leverage their existing infrastructure to scale up to 16PB of structured and unstructured data, avoid over-provisioning both storage and compute capacity, run analytics wherever the data sits, and make Hadoop highly available without a single point of failure or performance bottleneck. Symantec Enterprise Solution for Hadoop provides high availability for the file system's metadata server while ensuring that analytics applications continue to run as long as at least one node in the cluster is working. Because the Hadoop file system is replaced with Symantec's Cluster File System, every node in the cluster can access data simultaneously, eliminating both the performance bottleneck and the single point of failure.
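
This kind of substitution is possible because Hadoop's storage layer is pluggable: jobs and client code address storage through the generic org.apache.hadoop.fs.FileSystem API rather than HDFS directly, so a vendor connector can register its own file system implementation under its own URI scheme in the cluster configuration. The sketch below illustrates that general mechanism only; the "cfs" scheme and the com.example.cfs.ClusterFileSystem class are hypothetical placeholders, not details of Symantec's actual connector.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ClusterFsExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // Hypothetical settings: a vendor connector registers its FileSystem
            // implementation under its own URI scheme. The "cfs" scheme and the
            // class name are assumptions for illustration, not from the article.
            conf.set("fs.default.name", "cfs://analytics-cluster/");
            conf.set("fs.cfs.impl", "com.example.cfs.ClusterFileSystem");

            // Client code and MapReduce jobs use the generic FileSystem API,
            // so they do not need to know whether HDFS or a substitute
            // cluster file system sits underneath.
            FileSystem fs = FileSystem.get(conf);
            Path input = new Path("/data/weblogs/sample.log");

            try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(fs.open(input)))) {
                System.out.println("First line: " + reader.readLine());
            }
        }
    }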

“Hortonworks will provide support for this solution with the Hadoop connector between Cluster File System and the Hadoop stack,” Angspatt explains. “Anyone can download a free copy of HDFS and start running their big data analytics, but it is probably not the most common pathway. Folks are relying on companies such as Hortonworks or Cloudera to provide the support, the infrastructure, the services required with deploying Hadoop and so the partnership with Hortonworks helps us in many ways gain the trust and credibility in the analytics space.”

In many ways, Hadoop was designed for specific data analytics use cases, but not necessarily with the enterprise-class features organizations have come to expect, such as high availability, seamless data migration and data moves, and efficient use of compute capacity, Angspatt says. Symantec's Enterprise Solution for Hadoop connects Hadoop's business analytics to the existing storage environment while addressing the key challenges of server sprawl and high availability for critical applications, making it possible for customers to get the big data solution they need from the infrastructure they already have. "Symantec's Enterprise Solution for Hadoop is an enterprise-grade solution for big data that allows you to run your Hadoop stack on top of our Cluster File System where you are leveraging all of the data management capabilities of Cluster File System – the robust high availability capabilities, data management, and at the same time, leveraging your existing data that may exist on SAN storage."

The Symantec Enterprise Solution for Hadoop is available now to existing Cluster File System customers at no additional charge. Symantec Enterprise Solution for Hadoop supports Hortonworks Data Platform (HDP) 1.0 and Apache Hadoop 1.0.2. Customers running HDP 1.0 will be able to get Hadoop support and training from Hortonworks.

For more information, go to www.symantec.com/enterprise-solution-for-hadoop.
