Wikibon analyst says Hadoop pessimism is part of natural adoption cycle
Gartner Inc. caused a bit of a ruckus last week when it poured cold water on our hopes for Hadoop with a somewhat pessimistic report highlighting the low rate of adoption among enterprise users.
Gartner found that only one-quarter of the 284 technology and business leaders it surveyed were actually using Hadoop, with an even smaller subset operating their analytic clusters at any meaningful scale. That might not have come as a surprise, as previous surveys have shown similarly low rates of adoption, but what did raise eyebrows was Gartner’s assertion that a mere 18 percent of respondents said they plan to try out or adopt Hadoop in the next few years.
But Gartner’s rather somber outlook doesn’t mean that Hadoop has failed to live up to the hype. On the contrary, the figures are simply a reflection of what it takes for technologies as complex as Hadoop to mature to a point where they can be easily accepted by mainstream users, argues Wikibon’s Big Data and Analytics analyst George Gilbert.
The Gartner findings are “not all that surprising, and we’ve seen this happen before around 1999-2000 when there were lots of new software categories emerging,” Gilbert said. “Things were emerging so fast that there was no way customers could absorb it and put all that new software to work.”
One of Hadoop’s biggest problems is its high level of complexity, Gilbert explained. Hadoop, with its ecosystem of products that can be mixed and matched, gives users unparalleled flexibility when working with data, but there’s always a trade-off. In the case of Hadoop, “the flexibility comes from all these different components, and when you combine them all, your custom solution has a lot more operational complexity,” Gilbert said.
Another issue that needs to be overcome is that many users still don’t really know what they’re trying to do with Hadoop, Gilbert said. They’ve built up a lot of data over the years and jumped on the Hadoop bandwagon without a clear plan of action.
“If a customer says they want to use Hadoop to access a data lake, they’re almost implicitly acknowledging that they don’t have a specific usage scenario where it’s solving a pressing business problem,” Gilbert said. “The result is it looks more like a data swamp than a data lake.”
“Part of the original beauty of having a data lake is that you can put a lot of data in there that’s not refined or curated,” Gilbert added. “But your users must understand that they’re responsible for adding the structure and organization, and a lot of people don’t realize that.”
“So much will change in 18 months”
The Gartner report cited the “skills shortage” as one of the main factors preventing widespread Hadoop adoption. Gilbert acknowledged that Hadoop’s complexity limits the size of the talent pool, but said the problem can be overcome as the platform evolves and becomes easier to implement.
“It’ll probably be a whole lot easier to fix that complexity by running Hadoop in the cloud instead of running it on-premise,” Gilbert said.
A case in point: just three weeks ago, Microsoft announced the upcoming release of SQL Server 2016, the latest version of its flagship relational database management system (RDBMS). The new release incorporates a new version of PolyBase, which serves as a bridge between SQL Server and Hadoop. By making it easier to map data stored in the Hadoop Distributed File System (HDFS) as external tables in SQL Server, Microsoft opens Hadoop data to the huge ecosystem of SQL Server developers and administrators, who can query it using their existing skill sets.
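In practice, that bridge looks something like the following sketch. The cluster address, table, and column names here are hypothetical, and the exact PolyBase syntax varies by SQL Server version; the point is that once the mapping is defined, HDFS data can be queried with ordinary T-SQL:

```sql
-- Register the Hadoop cluster as an external data source
-- (hypothetical namenode address).
CREATE EXTERNAL DATA SOURCE HadoopCluster
WITH (
    TYPE = HADOOP,
    LOCATION = 'hdfs://namenode:8020'
);

-- Describe how the files in HDFS are encoded.
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',')
);

-- Map an HDFS directory as an external table in SQL Server.
CREATE EXTERNAL TABLE dbo.WebLogs (
    LogDate DATE,
    Url     NVARCHAR(400),
    Hits    INT
)
WITH (
    LOCATION    = '/data/weblogs/',
    DATA_SOURCE = HadoopCluster,
    FILE_FORMAT = CsvFormat
);

-- From here, the Hadoop data is queried like any other table.
SELECT LogDate, SUM(Hits) AS TotalHits
FROM dbo.WebLogs
GROUP BY LogDate;
```

The appeal for the skills-shortage problem Gartner cites is that nothing after the mapping step requires Hadoop expertise; it is standard SQL against what looks like a normal table.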
But what about the lack of interest in even trying Hadoop that the Gartner research indicated? Gilbert said that interest will develop as people see the technology deliver on its promises.
“Mainstream customers in any new market are skeptical until they see others doing something useful,” Gilbert said. “In the absence of Hadoop’s widespread usage, they’re saying this is immature technology that isn’t ready yet.”
Gilbert also said adoption plans are nearly impossible to measure at the moment because Hadoop is still evolving. For example, cloud-based Hadoop services will address the skills shortage and lower that barrier to adoption, he predicted.
In the meantime, there’s likely to be a lull in adoption while the community works on simplification issues. “We might see a downturn in the growth rate for a while,” Gilbert admitted. “But we might also see increased usage among Hadoop’s most sophisticated users, the big Internet companies, big banks, telcos and retailers who have been getting good results with their experiments.”
Image credits: Coffy via Pixabay.com; jaci XIII via Compfight cc