UPDATED 14:34 EST / JUNE 29 2023

Google Cloud’s BigQuery aims to move beyond the data warehouse

Cloud computing is expanding rapidly: next-generation capabilities are on the horizon, app developers are building data applications, and new infrastructure is emerging. Meanwhile, everyone is talking about generative artificial intelligence and the role data plays in that space.

Enter Google Cloud and new functionality for BigQuery, its serverless enterprise data warehouse. Every journey in generative AI starts with data, which has prompted Google Cloud to focus on moving beyond the data warehouse, according to Bruno Aziza (pictured), head of data and analytics at Google Cloud.

“The way we design BigQuery is way more than just a data warehouse. It’s an analytics system,” he said. “It’s what people want. What do I mean by that? It’s a system that handles any data, at any speed, that has embedded machine learning as a key principle.”
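
That notion of machine learning embedded in the warehouse maps to BigQuery ML, which lets teams train and run models with SQL alone. The following is a minimal sketch using the google-cloud-bigquery Python client; the dataset, table and column names (my_dataset.customers, a “churned” label and so on) are hypothetical and shown only to illustrate the pattern.

```python
# Minimal sketch: training and invoking a model with BigQuery ML from Python.
# Assumes the google-cloud-bigquery package is installed and application
# default credentials are configured; all dataset, table and column names
# below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

# Train a logistic-regression model directly in the warehouse with SQL.
train_sql = """
CREATE OR REPLACE MODEL `my_dataset.churn_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
SELECT tenure_months, monthly_spend, support_tickets, churned
FROM `my_dataset.customers`
"""
client.query(train_sql).result()  # blocks until the training job finishes

# Score new rows with the trained model, still without leaving BigQuery.
predict_sql = """
SELECT customer_id, predicted_churned
FROM ML.PREDICT(MODEL `my_dataset.churn_model`,
                TABLE `my_dataset.new_customers`)
"""
for row in client.query(predict_sql).result():
    print(row.customer_id, row.predicted_churned)
```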

Aziza spoke with industry analyst John Furrier in an exclusive interview with theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed how embedded machine learning functions as a key principle and the various capabilities recently launched for BigQuery. (* Disclosure below.)

Scale for next-generation applications

That embedded machine learning supports any type of data, whether structured, semi-structured or unstructured, within the same environment. At the same time, BigQuery remains open to other data platforms.

“In our case, we have BigQuery Omni, which allows you, from BigQuery, to query data that’s in Amazon, that’s in Azure, and we have this amazing data-sharing platform,” Aziza said. “Any week, over 6,000 organizations securely share about 275 petabytes of data. This is the type of scale that customers need in order to build their next-generation applications.”
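
As a rough illustration of what an Omni-style cross-cloud query can look like, the sketch below defines an external table over Parquet files in Amazon S3 and then queries it like any native table. It assumes an AWS connection already exists in the aws-us-east-1 location and that the dataset lives there; the connection, bucket and table names are hypothetical.

```python
# Minimal sketch of a BigQuery Omni-style cross-cloud query from Python.
# Assumes an AWS connection named my_aws_connection already exists in the
# aws-us-east-1 location and that aws_dataset is located there; the bucket
# path and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

# Expose Parquet files sitting in S3 as an external table in BigQuery.
create_sql = """
CREATE EXTERNAL TABLE IF NOT EXISTS `aws_dataset.orders_s3`
WITH CONNECTION `aws-us-east-1.my_aws_connection`
OPTIONS (format = 'PARQUET', uris = ['s3://my-bucket/orders/*'])
"""
client.query(create_sql).result()

# Query the S3-backed table with the same SQL used for native tables.
report_sql = """
SELECT region, SUM(order_total) AS revenue
FROM `aws_dataset.orders_s3`
GROUP BY region
ORDER BY revenue DESC
"""
for row in client.query(report_sql).result():
    print(row.region, row.revenue)
```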

When it comes to data, people are bringing their own data models to the table in a new way, which often raises three themes: the democratization of workloads, data security and choice. BigQuery became generally available in 2011; since then, the company has been working with gigantic companies around the globe, many of which people may not have heard of, according to Aziza.

“You know, Tokopedia is an e-commerce giant, and using our technology, they’re able to cut analytics computing costs by 25%,” he said. “This is a company that has an online marketplace connecting 10 million merchants, a hundred million customers every month around products, selection, payment, delivery.”

New value propositions around data

In the future, applications will be built to access all kinds of data, and a new value proposition is emerging as people realize that data in motion is valuable. Companies, however, don’t want to set that data in motion only to give it away for free in public data sets.

There’s a goal within Google Cloud to make it as easy as possible for people to onboard onto the platform and innovate with it, and its innovations have been designed around that idea, according to Aziza.

“This idea of having BigQuery editions is about making it really easy for you as a customer to assign the right version of BigQuery, with its associated capabilities, to the workload that you’re interested in running. And we’ll do the rest,” he said. “Autoscaling is a foundational capability of BigQuery editions. And what it does, put very simply, is that it follows your usage at the second level and it charges you only for that.”

For a data team building data applications and thinking about differentiation, the goal is to treat these as “just forget about it” features: Google Cloud takes care of the infrastructure and scales it to give organizations the best price performance so they can do their jobs, according to Aziza.

When it comes to editions, the concept emerged as the company was learning from customers innovating across all types of data and use cases, Aziza said.

“The first step was, let’s look at their specific workloads. Where do they start, and how do they mature? They typically start with easy workloads around reporting and so forth, and then they’ll mature,” he said. “There are even multi-region, more sophisticated needs.”

From there, the company created three editions to make it easy for organizations to say that a given workload should be associated with the Standard edition, which provides the basics needed to get started. Other workloads may need machine learning, and one of the capabilities of editions is the ability to mix and match, giving customers flexibility and predictability over how much they will spend on the platform.
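
In practice, associating a workload with an edition is done through slot reservations. The sketch below uses BigQuery’s reservation DDL from Python to create an Enterprise-edition reservation with an autoscaling ceiling and to route one project’s queries to it. The admin project, region, reservation and assignee names are hypothetical, and the exact option names should be checked against current BigQuery documentation.

```python
# Minimal sketch: pinning a workload to a BigQuery edition with an
# autoscaling slot reservation, using reservation DDL from Python.
# Project, region, reservation and assignee names are hypothetical, and
# DDL option names should be verified against current documentation.
from google.cloud import bigquery

client = bigquery.Client(project="admin-project")

# Create a reservation on the Enterprise edition with a 100-slot baseline
# that can autoscale up to 400 slots as demand grows.
reservation_sql = """
CREATE RESERVATION `admin-project.region-us.analytics_workload`
OPTIONS (
  edition = 'ENTERPRISE',
  slot_capacity = 100,
  autoscale_max_slots = 400
)
"""
client.query(reservation_sql).result()

# Route one project's query workload to that reservation.
assignment_sql = """
CREATE ASSIGNMENT `admin-project.region-us.analytics_workload.data_team`
OPTIONS (assignee = 'projects/data-team-project', job_type = 'QUERY')
"""
client.query(assignment_sql).result()
```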

“Autoscaling is a way for us to give you the best capacity and the best price performance as we’re following your usage,” Aziza said. “This is kind of our way to bring, if you will, a key competitive advantage for us, which is artificial intelligence, where we can get a really good sense of how you’re using the platform and optimize all this functionality for you.”

Here’s theCUBE’s complete video interview with Bruno Aziza:

(* Disclosure: Google LLC sponsored this segment of theCUBE. Neither Google nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
