Moonshot is Great for HP’s Big Data Line Up, But Needs More Unification

On Monday Hewlett-Packard pulled the covers off Moonshot, a revolutionary server built for ultra-efficient hyperscale environments. Wikibon analyst Jeff Kelly discussed the debut of Moonshot on yesterday’s NewsDesk show with Kristin Feledy.

Kelly says that the solution HP unveiled this week is the product of a development effort the company first announced in November 2011. The big deal about Moonshot is that it requires 89 percent less power and 80 percent less space than traditional boxes, and supports a wide range of processors from different vendors. This latter feature is an industry first that gives customers the ability to customize their infrastructure to their workloads.

Kelly names Big Data as one example of a workload that requires a lot of low-level optimization. One of the main reasons is that Hadoop clusters are scaled over time in response to increased usage, so factors like power efficiency and the physical footprint of servers become major considerations as the environment grows.

Moonshot is an important addition to Hewlett-Packard’s lineup, but it’s only one component of the vendor’s Big Data portfolio. Kelly says that the portfolio also includes some of the company’s traditional x86 servers, software gained through the acquisitions of Vertica and Autonomy, and a number of integrated Hadoop appliances.

The vendor has an impressive set of products, but Kelly believes that there’s still plenty of room for improvement. Specifically, HP could do more to unify its portfolio: Vertica’s real-time database could be bundled with Autonomy’s text analytics software, and Autonomy’s solutions could go a long way toward enhancing Hadoop. Kelly estimates that it will take HP three to four years to come up with more consumable offerings.

Hewlett-Packard is currently the second largest Big Data vendor by revenue behind IBM, which is also in the process of consolidating its portfolio. EMC, another major player, is likewise revamping the datacenter to meet the demands of Big Data, but it is taking a somewhat different approach with the Pivotal Initiative.