UPDATED 10:48 EDT / JUNE 13 2012

Analysis: Hortonworks Plays It Smart by Partnering for Mainstream Adoption

Much of the challenge for any early-market player, particularly in open-source software, is delivering solutions with broad appeal to mainstream customers in a timeframe that meets market demand. Hortonworks is doubling down on its Big Data market development and, after just one year in business, is today showcasing its new Hortonworks Data Platform (HDP 1.0) at its flagship Hadoop Summit event, which I covered here – <link>.

Hortonworks is playing it smart as it develops its portfolio of offerings around the Apache Hadoop project by leveraging relationships with some of the biggest ISVs with deep pockets. With names such as Microsoft, VMware and Red Hat coming to the table, Hortonworks has cleverly found a shortcut to mainstream appeal while remaining true to the credo of the open-source community around Hadoop development.

Hortonworks has been pretty open about the shortcomings of Hadoop for the enterprise: cluster implementation and management is difficult, high-availability (HA) features are lacking, and an ever-present skills gap makes it difficult to deploy a big data project on open-source platforms. This is why established technology companies, such as EMC with Greenplum and HP with Vertica, have garnered much of the enterprise attention (and revenue) in the nascent Big Data market. These companies have the services organizations, a deep understanding of business requirements and the resources to deliver solutions that make them the trusted partner for companies embarking on Big Data projects. Today's announcement, however, puts those traditional technology suppliers on notice and places Hortonworks (and other open-source players such as Cloudera and 10gen) in the driver's seat to disrupt them for the next generation of applications.

Hortonworks is already addressing some of the major must-haves for the enterprise with HDP 1.0. By remaining true to open-source development and commodity hardware, costs are extremely competitive relative to proprietary solutions that depend on high-end hardware and software. Partnering with VMware to provide HA capabilities for the most important nodes in the Hadoop cluster is really, really smart: IT organizations have already bought into VM mobility solutions as an answer for HA, so why build this from scratch when Hortonworks can leverage existing technology and goodwill?

The relationship with Microsoft to accelerate the development of Hadoop on Azure is also very important. Microsoft is walking down the long path of eating its own dog food in the enterprise software market and will continue to make deep investments in the Azure platform, as it is the center of the company's cloud strategy and the linchpin for remaining relevant in the next wave of IT applications. This joint relationship will undoubtedly aid in the development of analytics capabilities, where data scientists can bring old-world RDBMS expertise to bear while easily accessing data held in Hadoop's file system. The introduction of a metadata catalog based on Apache HCatalog in HDP 1.0 cements the notion that there is a real need for a bridge between structured and unstructured data to further big data analytics.
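
To make that bridge concrete: HCatalog exposes Hive's table definitions to other Hadoop tools, so a MapReduce job can consume typed rows with a known schema instead of parsing raw HDFS files. Below is a minimal sketch of a job driver using the HCatalog 0.4-era MapReduce input format shipped around the HDP 1.0 timeframe; the database and table names ("default", "web_logs") are hypothetical placeholders, and the exact API is my assumption, not something detailed in the announcement.

```java
// Minimal sketch: a MapReduce driver that reads a table registered in
// HCatalog, so the job sees schema-aware records rather than raw files.
// Assumes the HCatalog 0.4-era API (org.apache.hcatalog.*); the database
// and table names below are hypothetical.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hcatalog.mapreduce.HCatInputFormat;
import org.apache.hcatalog.mapreduce.InputJobInfo;

public class HCatReadJob {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "hcat-read-example");
        job.setJarByClass(HCatReadJob.class);

        // Point the job at a Hive/HCatalog table instead of raw HDFS paths;
        // HCatalog supplies the schema, location and storage format.
        HCatInputFormat.setInput(job,
                InputJobInfo.create("default", "web_logs", null));
        job.setInputFormatClass(HCatInputFormat.class);

        // ... mapper, reducer and output configuration elided ...
    }
}
```

The design point is that the same table definition becomes visible to Hive, Pig and MapReduce alike, which is exactly the structured-to-unstructured bridge the HDP 1.0 catalog is aiming at.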

Let’s be clear: infrastructure and platforms are really important right now in Big Data because Big Data is hard to do. But the end-game for the Big Data market is analytics: building a new generation of applications that allow advanced analysis of massive data sets holding untapped information, with the potential to optimize business operations, reach new customers, revolutionize healthcare, change education models, and enable many more uses that we can't even yet conceive.

SiliconANGLE.tv

Questions I will be asking on theCUBE (watch live on SiliconANGLE.tv) at Hadoop Summit – HadoopSummit.org

1. What are the new features and developments in HDFS? How is it more performant and robust?

2. What is the state of enterprise versions of the APIs?

3. Looking for tools to slice and dice data in Hadoop and HBase.

4. Will Hadoop be used as the primary data warehouse in many organizations?

5. The future of Avro: Why was Avro created? How does it differ from Protocol Buffers?

6. HA NameNode updates and evolution.

7. General questions around how real time and high availability meet the needs of diverse analytics for big clients.

