

Databricks Inc.’s vision for an enterprise world powered by artificial intelligence is based on vibes — or rather, vibe coding.
At Databricks’ annual conference this week, the company announced a series of data and AI products on its open analytics platform, including the serverless Lakebase database, Agent Bricks for automating agent building, and tools that enable coding with AI through natural language prompts, also called vibe coding.
The show floor at Databricks’ Data + AI Summit.
“The apps themselves are getting much better,” said George Gilbert (pictured, right), principal analyst for theCUBE Research. “This was something Snowflake pioneered with container services and native apps so that your application inherits the governance of the underlying data and platform. But here, [Databricks] goes a step further with the underlying Lakebase. They can support transactions.”
Gilbert spoke with theCUBE’s John Furrier (left) at the Databricks Data + AI Summit, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed takeaways from the keynote session and the development of Databricks’ flagship data lakehouse platform.
Databricks pioneered the data lakehouse, which combines the capabilities of a data warehouse and a data lake. Now, the data and AI company has integrated Lakebase, a transactional, or online transaction processing (OLTP), database built on PostgreSQL, which Databricks gained through its recent acquisition of Neon Inc.
Lakebase’s integration into the platform simplifies the development of AI-driven applications by enabling customers to use a single set of controls to manage their data and analytics.
“They took a Postgres-compatible database that they were working on that they also recently bought [and] … they separated compute from storage,” Gilbert explained. “The new wrinkle here is it can talk directly to open object store, so you can get at the underlying tables and replicate them bidirectionally into and out of the lakehouse.”
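Because Lakebase exposes a Postgres-compatible interface, a standard Postgres driver should be able to read and write it directly. The minimal sketch below assumes a hypothetical endpoint, credentials and “orders” table for illustration; it is not a documented Databricks connection string, and synchronization with lakehouse tables happens on the platform side.

```python
# Minimal sketch: Lakebase is Postgres-compatible, so an ordinary Postgres
# driver should work. Host, credentials and table name are hypothetical
# placeholders, not real Databricks endpoints.
import psycopg2

conn = psycopg2.connect(
    host="my-lakebase-instance.example.com",  # hypothetical endpoint
    port=5432,
    dbname="app_db",
    user="app_user",
    password="***",
    sslmode="require",
)

with conn, conn.cursor() as cur:
    # A transactional write -- the kind of OLTP workload Lakebase targets.
    cur.execute(
        "INSERT INTO orders (order_id, status) VALUES (%s, %s)",
        ("o-1001", "pending"),
    )
    # Read it back; replication into and out of the lakehouse is handled
    # by the platform, per the bidirectional sync described above.
    cur.execute("SELECT order_id, status FROM orders WHERE order_id = %s", ("o-1001",))
    print(cur.fetchone())

conn.close()
```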
Databricks’ other major announcement was Agent Bricks, a tool for automatically optimizing AI agents during development. Users can request agents for specific tasks through natural language prompts, and Agent Bricks then generates a series of AI “judges” to test the agents’ performance against given criteria.
“Agent Bricks … has these building blocks where you say, ‘I want an information extraction agent or knowledge assistant agent or multi-agent supervisor,’ and then something that … monitors those individual agents,” Gilbert said. “They track all the traces of the agents in MLflow, and then they generate judges.”
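As a rough illustration of the LLM-as-judge pattern Gilbert describes, and not Agent Bricks’ actual interface, the sketch below scores recorded agent traces against stated criteria. The call_llm helper is a hypothetical stand-in for whatever model-serving endpoint is available; it returns a canned reply here so the example runs end to end.

```python
# Illustrative sketch of the "LLM as judge" pattern: one model grades
# another agent's recorded traces against explicit criteria.
import json

JUDGE_PROMPT = """You are grading an information-extraction agent.
Criteria: {criteria}
Agent input: {agent_input}
Agent output: {agent_output}
Return JSON: {{"score": 1-5, "reason": "..."}}"""


def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a model-serving call; replace with a real
    endpoint. Returns a canned response so the sketch runs as-is."""
    return '{"score": 4, "reason": "Extracted fields match the source document."}'


def judge_trace(trace: dict, criteria: str) -> dict:
    """Score one recorded agent trace against the given criteria."""
    reply = call_llm(JUDGE_PROMPT.format(
        criteria=criteria,
        agent_input=trace["input"],
        agent_output=trace["output"],
    ))
    return json.loads(reply)


# Example: judge every trace captured during a test run.
traces = [{"input": "Invoice #123 ...", "output": '{"invoice_id": "123"}'}]
scores = [judge_trace(t, "Extracted fields must match the source document") for t in traces]
print(scores)
```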
With each new update, Databricks is moving toward AI that is deeply integrated with every layer of a customer’s enterprise ecosystem. Going forward, Gilbert predicts that Databricks will build on the trend of vibe coding and live coding with AI tools that unify a customer’s data environments.
“You sort of tell the AI engineer, the software developer, an agent, what you want to do,” he said. “In the past, all it did was generate code. But if it’s now aware of what’s inside [Unity Catalog], it can talk to the Databricks tools. The vibe coding is starting to be an umbrella over the Databricks data and tools.”
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of the Databricks Data + AI Summit: