Intel’s Hadoop Play is about Security, Standardization [Exclusive Interview]

Intel is not your typical purveyor of open source. A proprietary juggernaut like Intel entering the Hadoop distribution community further validates open source in the marketplace. Boyd Davis, VP of Marketing at Intel, sat down with theCube hosts John Furrier and Dave Vellante last night to discuss Intel’s big contribution plans for the Hadoop community, among other things.  The full video is below.

At Strata, Intel announced its Intel Distribution for Apache Hadoop software (Intel Distribution). The offering, which includes Intel Manager for Apache Hadoop software, is built from the silicon up to deliver industry-leading performance and improved security features.

Davis said that Intel’s main focus is ease of deployment for Hadoop, framed around a three-level framework:

  1. Interactivity and responsiveness
  2. Security — encryption in the file system (Project Rhino)
  3. Reliability and enterprise quality of the stability of the platform

“We’re building the silicon into Hadoop, rather than building Hadoop into the silicon,” said Davis. He elaborated that clients want a more granular level of security, and customers are clamoring for it now. He added that Intel is seeing many clients demand better security for multi-dependency data storage.

The client feedback Intel is gathering points to a need to address both Big Data latency and security. “We are absolutely planning on building a service offering around packaged applications and making big data more accessible and more secure,” Davis said.  His company aims to drive standardization and integration wherever possible in Big Data.

Davis identified intelligent devices (smartphones, tablets, etc.) at the edge of the network as posing the biggest data safety risk. Securing data across the entire network, all the way to the edge, will require an integrated solution of hardware and software.

Furrier asked Davis what was on his to-do list for the next 6-12 months. Data center scope, exposing APIs from legacy products, data center optimization, and investing in the long-term success of Hadoop all fit into those plans. Big Data is going to be a mix of old and new tools, and Intel wants to create a platform. The goal is for partners like SAP, SAS, and Cisco to bring great infrastructure to the stack alongside Intel’s baked-in offerings.

We’ll have more great interviews from Strata throughout the week.  Check out our live broadcast here, and head to our YouTube channel for all of our archived segments from Strata and other great events in Big Data and the Cloud.