EMC Supercharges Hadoop
February 26th, 2013 By: John Rath
At the RSA conference this week in San Francisco, EMC announced Pivotal HD, a new distribution of Apache Hadoop. Pivotal HD features native integration of EMC’s Greenplum massively parallel processing (MPP) database with Apache Hadoop.
The new EMC Greenplum-developed HAWQ technology brings ten years of large-scale data management research and development to Hadoop, and EMC says it delivers more than a 100X performance improvement over existing SQL-like services on top of Hadoop. What makes Pivotal HD different is its ability to offer the full spectrum of the SQL interface and run reports without moving data between systems or using connectors that require users to store the data twice. It removes much of the complexity of using Hadoop, expanding the platform’s potential and productivity and allowing customers to enjoy the benefits of what EMC calls the most cost-effective and flexible data processing platform ever developed.
Sam Grocott, vice president of marketing and product management, EMC Isilon, noted, “The introduction of Pivotal HD, combined with EMC Isilon’s native integration of the Hadoop Distributed File System (HDFS) protocol, continues the evolution of the industry’s first and only enterprise-proven Hadoop solution on a scale-out NAS architecture. This powerful combination succeeds in reducing the complexities traditionally associated with Hadoop deployments and allows enterprises to easily extract business value from unstructured data.”
Built on the Greenplum MPP analytical database, Pivotal HD is a true SQL parallel database on top of the Hadoop Distributed File System (HDFS). HAWQ capabilities of note include Dynamic Pipelining, a world-class query optimizer, horizontal scaling, SQL compliance, interactive querying, deep analytics, and support for common Hadoop formats. HAWQ unlocks the potential of Hadoop’s fault-tolerant storage by bringing the vast pool of “data worker” tools and languages to the Hadoop ecosystem.
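The claim above is that full SQL runs in place against data where it already lives, rather than exporting it to a separate warehouse through a connector (storing it twice). As a rough conceptual sketch only, the toy below uses Python’s built-in sqlite3 as a stand-in SQL engine and an in-memory table as a stand-in for data resident in HDFS; none of these names reflect an actual HAWQ API.

```python
import sqlite3

# Toy stand-in: sqlite3 plays the role of the SQL engine, and the
# "trades" table plays the role of data already stored in HDFS.
# The point illustrated: aggregation and filtering run directly
# against the stored data, with no export step and no second copy.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [("EMC", 100), ("EMC", 250), ("NYX", 50)],
)

# Full SQL (GROUP BY, SUM, ORDER BY) over the single stored copy.
rows = conn.execute(
    "SELECT symbol, SUM(qty) FROM trades GROUP BY symbol ORDER BY symbol"
).fetchall()
print(rows)  # [('EMC', 350), ('NYX', 50)]
```

In the connector model this article contrasts against, the data would first be copied out of HDFS into the analytical database before any such query could run.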
“With Pivotal HD, we can check off many of the items on our Hadoop wish-list—things like plug-in support for the ecosystem of tools, improved data management and greater elasticity in terms of the storage and compute layer,” said Steven Hirsch, chief data officer and SVP of Global Data Services, NYSE Euronext. “But above all, it provides true SQL query interfaces for data workers and tools—not a superficial implementation of the kind that’s so common today, but a native implementation that delivers the capability of real and true SQL processing and optimization. Having a single Hadoop infrastructure for Big Data investigation and analysis changes everything. Now add to all of this functionality the fact that the SQL performance is up to 100x faster than other offerings and you have an environment that we at NYSE Euronext are extremely excited about.”