George Mathew, the President and Chief Operating Officer of Alteryx, shared his take on big data with Dave Vellante in a recent Cube interview.
Mathew’s first observation is that the analytics trend has come a very long way since his last appearance on theCube during SAP’s SAPPHIRE 2010. Back then, he says, the community was focused on fleshing out the infrastructure around Hadoop. The ecosystem managed to solve a lot of the key challenges in the span of two years, and now the emphasis is shifting towards extracting business value out of the technology. The term big data is used a lot more often, for one.
Mathew sees his company playing a central role in this transition. He described Alteryx’s offering as a fast, cost-efficient means of consumerizing analytics within the enterprise and making data much more accessible to workers.
The executive also makes a point of distinguishing between business intelligence and big data. BI, in his book, is data warehousing technology with ETL and dashboards that sit on top and convey information to the user – usually an analyst with a say in big business decisions. Big data, on the other hand, is the process of taking data from both sides of the firewall and packaging it for tens of thousands of workers. This data is real-time and predictive rather than historical, and one data analyst should be able to handle the entire task. Added complexity translates into slower response times, according to Mathew.
The executive expands on the technological angle. He says that big data is not a substitute for BI, but rather a very viable supplement. A big part of the appeal is that CIOs don’t have to worry so much about the price tag – testing Hadoop and NoSQL internally requires very few resources compared to an enterprise’s existing infrastructure investments.
See the full interview below.