UPDATED 17:20 EDT / JANUARY 21 2021


It’s time for a data architecture do-over

As Moore’s Law becomes less relevant, technological innovation is entering a new age. Instead of ever tinier and more powerful computing hardware, the next wave of value creation will come from the mixing of big data, machine learning/artificial intelligence, and cloud.

“The coming decade will be one where customers start to unlock outsize value out of [cloud],” said John “JG” Chirapurath (pictured), vice president of Azure data, artificial intelligence and edge at Microsoft. “With the amount of data that people are moving to the cloud, you’re going to see the use of analytics, AI for business outcomes explode.”

Chirapurath spoke with Dave Vellante, host of theCUBE, SiliconANGLE Media’s livestreaming studio, during theCUBE on Cloud event. They discussed how the application of AI/ML to cloud data will affect technological innovation in the coming decade.

One way to land the data, multiple ways to use it

Today’s marketplace offers many either/or options when it comes to data architecture, and the resulting complexity breeds confusion. Convoluted data pipelines leave companies struggling to extract value from their data, while app builders chafe at the delay between hatching an idea and monetizing it.

“What worked in the past will not work going forward,” Chirapurath stated. He sees the biggest push for change in the areas of analytics and AI. Enterprise analytics today is a mix of relational systems, Hadoop systems, data marts, and “large honking” enterprise data warehouse databases built up over decades, he explained. And when companies start to modernize, they don’t want to migrate all the complexity into the cloud as-is.

“What they really want is a completely different way of looking at things,” Chirapurath said.

To solve this, the solutions of the future need to be converged, according to Chirapurath. “It isn’t about having 50 different services. It’s really about having that one comprehensive service that is converged,” he said. This is accomplished by introducing a layer that abstracts away the complexity of the underlying technology, giving the ability to land any kind of data in the data lake, and using any kind of compute engine on top to drive insights from that data.

Whether it is hydrating a relational data warehouse, performing ad hoc analytics, invoking intelligence on that data, or even bringing in a machine learning model on the prepped data, “you can do so,” according to Chirapurath. “Inherently, when customers buy into this proposition, what it solves for them and what it gives to them is complete simplicity. One way to land the data — multiple ways to use it. And it’s all regulated.”
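The pattern Chirapurath describes — land data once, then point any number of engines at it — can be illustrated with a minimal toy sketch. This is not Azure Synapse code; the file layout, record schema, and function names below are hypothetical, chosen only to show one shared landing zone serving two different "compute engines."

```python
import json
import tempfile
from collections import defaultdict
from pathlib import Path

# Hypothetical "landing zone": raw event records are written once,
# in one place, regardless of how they will later be consumed.
lake = Path(tempfile.mkdtemp()) / "lake"
lake.mkdir(parents=True)

events = [
    {"sensor": "cam-01", "kind": "rhino", "count": 3},
    {"sensor": "cam-02", "kind": "vehicle", "count": 1},
    {"sensor": "cam-01", "kind": "rhino", "count": 2},
]
for i, ev in enumerate(events):
    (lake / f"event-{i}.json").write_text(json.dumps(ev))

def load_events(lake_dir: Path) -> list[dict]:
    """Shared access layer: every engine reads the same landed files."""
    return [json.loads(p.read_text()) for p in sorted(lake_dir.glob("*.json"))]

# Engine 1: warehouse-style aggregation over the landed data.
def totals_by_kind(lake_dir: Path) -> dict:
    agg: dict = defaultdict(int)
    for ev in load_events(lake_dir):
        agg[ev["kind"]] += ev["count"]
    return dict(agg)

# Engine 2: ad hoc query over the very same files -- no second copy
# of the data, no separate ingestion pipeline.
def sightings_for(lake_dir: Path, sensor: str) -> list[dict]:
    return [ev for ev in load_events(lake_dir) if ev["sensor"] == sensor]

print(totals_by_kind(lake))                # {'rhino': 5, 'vehicle': 1}
print(len(sightings_for(lake, "cam-01")))  # 2
```

The point of the sketch is the shape, not the storage format: both functions consume the identical landed files through one access layer, which is the "one way to land the data, multiple ways to use it" proposition in miniature.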

Practical use cases for global data mesh

With this concept “a data warehouse or a data lake could just be a node inside of a global data mesh” with an abstraction layer managing the underlying technology, Vellante pointed out.

The idea of upending the current data architecture model by creating a global data mesh has been in the news lately, thanks to Snowflake Inc.’s blockbuster initial public offering. One of the foremost thinkers on the subject is data architect Zhamak Dehghani, who also spoke with Dave Vellante during theCUBE on Cloud event. Dehghani’s premise is that data product and service builders are “frustrated because the big data system is generic to context. There’s no context in there,” Vellante recalled. By building context into the big data architecture and system, you can get products to market much faster.

Customers are already using Microsoft’s Azure Synapse Analytics in this way, according to Chirapurath. “It’s not where they start,” he said. “Oftentimes a customer comes and says, ‘Look, I’ve got an enterprise data warehouse; I want to migrate it.’ Or, ‘I have a Hadoop system; I want to migrate it.’ But from there, the evolution is absolutely interesting to see.”

Chirapurath gave the example of the Peace Parks Foundation, which monitors the endangered wild rhino in a 100,000-square-kilometer conservancy area in Southern Africa. Gathering data from connected devices across the park, the foundation analyzes it using machine learning to identify poachers and rhinos at risk. So instead of patrolling at random, rangers can scramble strategically when and where they’re needed.

“The importance is really getting your data in order, landing consistently whatever the kind of data it is, build the right pipelines, and then the possibilities of transformation are just endless,” Chirapurath said.

Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of theCUBE on Cloud event:

Photo: SiliconANGLE
