

Ten years into the Hadoop-driven big data “revolution,” the tools and platforms are widely available. But, writes Wikibon Big Data and Analytics Analyst George Gilbert, “the path from procurement to business value remains complex and labor intensive.”
Gilbert attributes this to the community’s intensive focus on tools rather than applications. This is a typical cycle in the IT industry, in which “the tools and platforms that work for one class of applications go through a process of ‘adaptive stretch,’ beyond which they become unproductive for new applications,” he writes. At that point a new set of tools and platforms emerges to empower the new class of applications.
However, Gilbert sees hope as big data decision-makers shift their focus to new questions.
One symptom of the ‘adaptive stretch’ of Hadoop is the rising cost of integration. Cloudera Inc. told Wikibon that it budgets 50 percent of the engineering effort for each new component to integration, leaving only 50 percent for adding functionality. This complexity explains why customers need five administrators with specialized skills to operate even a 10-server pilot. Another symptom is that most large Hadoop production deployments are concentrated in the tech industry, including ad tech, rather than in more traditional enterprises.
The full article, available to Wikibon Premium subscribers, discusses the issues in more depth and provides guidance for corporations struggling with big data tool complexity.