Pentaho’s Quentin Gallivan predicts Big Data will get to work in 2016
Data blending, the process of quickly and straightforwardly extracting value from multiple data sources, has continued to grow this year, but what lies ahead for 2016?
In the latest installment of SiliconANGLE's 2016 predictions series, Pentaho Corporation Chief Executive Officer Quentin Gallivan predicts that data blending will be bigger than ever next year, that the Internet of Things (IoT) will get real, and that Big Data will continue its move to the cloud.
Gallivan’s Big Data predictions are as follows.
Big Data gets to work
Big Data will be put to work in the year ahead, Gallivan notes, and data blending will be bigger and more important than anyone could have imagined back in 2010, when Hadoop was just getting started.
“As business and society confronts large-scale and systemic security, environmental, health and economic problems and opportunities, a platform that blends unstructured and relational data is turning out to be an indispensable ‘multi-tool’. As one major telecommunications customer blends data to halt potentially catastrophic cybersecurity breaches, a retailing customer is applying it to raise revenue and loyalty by personalizing in-store recommendations.”
“In 2016, demands will continue to grow in sophistication. For instance, consumers will increasingly expect retailers to deliver highly customized recommendations that predict what they want at the right time through the right device and followed through with seamless and secure e-commerce transactions.”
“Data blending’s potential in areas from automotive telemetry to medical science to national security is enormous, and we’re only at the beginning.”
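To make the idea concrete: data blending in this sense means joining semi-structured or unstructured records (logs, events, free text) against relational data so both can be analyzed together. The following is a minimal illustrative sketch in Python with pandas, using entirely hypothetical data and column names; it is not Pentaho's tooling, just the general pattern Gallivan describes:

```python
import pandas as pd

# Hypothetical relational data: customer records from a warehouse table
customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "segment": ["loyal", "new", "loyal"],
})

# Hypothetical unstructured data: raw clickstream log lines
log_lines = [
    "2016-01-04 customer=1 viewed=shoes",
    "2016-01-04 customer=3 viewed=jackets",
]

# Parse each log line's key=value pairs into a tabular form
parsed = pd.DataFrame(
    [dict(kv.split("=") for kv in line.split()[1:]) for line in log_lines]
)
parsed["customer"] = parsed["customer"].astype(int)

# Blend: join the parsed events against the relational records
blended = parsed.merge(customers, left_on="customer", right_on="customer_id")
print(blended[["customer", "segment", "viewed"]])
```

The blended result lets an analyst ask questions that neither source answers alone, such as which customer segments viewed which products, which is the kind of in-store personalization use case described above.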
IoT gets real
The McKinsey Global Institute estimated in its June report, “Unlocking the potential of the Internet of Things,” that IoT will create anywhere from $3.9 trillion to $11.1 trillion in economic value by 2025.
Gallivan believes the key use cases driving this transformation include predictive maintenance, smart cities and smart homes.
“Large industrial companies not viewed as part of the Internet economy are hitting back hard in transforming their companies around the Industrial Internet. These companies make, move, sell and support physical things and these sensors are getting plugged into the internet. Introducing these physical things into the internet presents myriad new challenges and opportunities in the areas of data governance, standards, health and safety, security and supply chain – to name but a few. Companies must start planning now or risk being left behind.”
Next-Gen Analytics will be embedded and delivered at the point of impact
Ventana Research found in a survey of over 250 top IT executives that analytics was the most critical skill for the future as a strategic enabler that will help companies grow revenues, reduce costs and make companies more productive.
“While the classic Business Intelligence model of a data analyst using a tool outside the application flow to analyze historical data is viewed as somewhat helpful,” Gallivan says, “the next generation of analytics, where business users consume analytics as part of the business application will be viewed as mission-critical.”
“Real-time analytics, blending both unstructured and structured data, deployed within the application will be critical to delivering insight for 360-degree view of the customer, IoT, supply chain management, and security and fraud prevention use cases.”
The cool tools are getting ready for prime time
Cool new open source tools designed to enable large-scale, high-volume analytics on petabytes of data, such as Spark, Docker, Kafka and Solr, are moving from the “awkward teenager” phase to the “bearded hipster” phase, according to Gallivan.
“These tools are critical to driving faster innovation and are becoming more mature as we speak. The in-memory processing framework of Spark which provides the key to machine learning and real-time data streaming on large data sets is being led by big data pioneers like Databricks and Cloudera, as well as embraced by large technology companies like IBM.”
Cloud as the preferred deployment model for Big Data
Gallivan believes that the cloud is emerging as a preferred deployment model for big data.
“At our last Strategic Advisory Board session almost all members had either deployed their big data applications in the cloud, or were planning to. This includes the stock exchange regulator FINRA, which operates in a highly secure environment, handling and looking for anomalies in a whopping 75 billion transactions every day.”
“Companies want the option to deploy Big Data applications either behind the firewall, 100 percent in the cloud or a hybrid, private/public cloud environment.”
Investment in UX will grow within the Big Data analytics ecosystem
User experience (UX) will be a major investment area in the Big Data analytics ecosystem, starting with cloud but going further into building better user experiences for streaming and predictive analytics, Gallivan predicts.
“It’s about integrating new, secure technologies like facial recognition to reassure users that their data is safe and trustworthy. Moving towards real-time speed is also a key part of the user experience. Our customers tell us that Hadoop is still too hard, and are hoping that the vendors invest in making the experience easier. The easier we all make it, the more use cases we unlock for the enterprise.”
Image credit: SiliconANGLE/theCUBE.