UPDATED 09:00 EST / SEPTEMBER 10 2024


Oracle boosts AI development features for analytics and HeatWave managed service

Oracle Corp. will use its Oracle CloudWorld conference today to announce new artificial intelligence capabilities spanning its cloud database, analytics and development platforms, led by a raft of new features for its HeatWave managed MySQL database service.

Oracle said some organizations are building generative AI applications with HeatWave and moving them into production in less than a month, crediting features such as in-database large language models with reducing the need for AI expertise, manual integrations and complex security provisions. Customers using HeatWave on the Amazon Web Services Inc. cloud can replace up to six separate AWS services with a single HeatWave instance, the company said.

They can automate vector store creation and vector embedding generation, use in-database LLMs on CPUs or with models from Amazon Bedrock, and converse with documents in Amazon S3 at speeds that Oracle claimed exceed alternatives from vendors such as Snowflake Inc., Databricks Inc. and Google LLC.
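The workflow HeatWave is automating here, chunking documents, embedding them into a vector store and grounding an LLM’s answer in the retrieved chunks, is the standard retrieval-augmented generation pattern. The Python sketch below shows that generic pattern with hypothetical embed() and generate() functions standing in for an embedding model and an LLM; it is an illustration of the concept, not HeatWave’s interface.

```python
# Generic retrieval-augmented generation (RAG) sketch of the workflow HeatWave
# automates in-database. embed() and generate() are hypothetical stand-ins for
# an embedding model and an LLM; this is not HeatWave's API.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical embedding model: map text to a fixed-size unit vector."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=384)
    return v / np.linalg.norm(v)

def generate(prompt: str) -> str:
    """Hypothetical LLM call; a real system would invoke an in-database or hosted model."""
    return f"[answer grounded in: {prompt[:80]}...]"

# 1. Build a vector store: one embedding per document chunk.
chunks = ["HeatWave is a managed MySQL service.",
          "Lakehouse queries data in object storage.",
          "AutoML trains models inside the database."]
store = [(chunk, embed(chunk)) for chunk in chunks]

# 2. Retrieve the chunks most similar to the question.
question = "How does HeatWave query files in object storage?"
q = embed(question)
top = sorted(store, key=lambda item: float(item[1] @ q), reverse=True)[:2]

# 3. Ask the LLM with the retrieved context prepended.
context = "\n".join(chunk for chunk, _ in top)
print(generate(f"Context:\n{context}\n\nQuestion: {question}"))
```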

A lakehouse built on top of HeatWave, a unified data architecture that combines the scalability and flexibility of a data lake with the reliability and performance of a data warehouse, allows high-performance queries on structured, semistructured and unstructured data in Amazon S3 storage. It supports queries across hundreds of terabytes of data in object storage at database query speed, without copying the data from the object store into the database.
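As a rough sketch of how such a lakehouse table is defined and queried from a client, the Python example below issues HeatWave-style statements through mysql-connector-python. The ENGINE=lakehouse options, the sys.heatwave_load procedure call and the object-storage URI are assumptions for illustration and may differ from the exact syntax in a given HeatWave release.

```python
# Illustrative only: defining a Lakehouse external table over files in object
# storage and querying it in place. The table options, the sys.heatwave_load
# call and the storage URI are assumptions to verify against HeatWave docs.
import mysql.connector

conn = mysql.connector.connect(host="heatwave-host", user="admin",
                               password="...", database="demo")
cur = conn.cursor(buffered=True)  # buffer results so each statement completes cleanly

# External table whose data stays in object storage (the URI is a placeholder).
cur.execute("""
    CREATE TABLE orders_lake (
        order_id BIGINT,
        amount   DECIMAL(10,2),
        region   VARCHAR(32)
    )
    ENGINE = lakehouse SECONDARY_ENGINE = rapid
    ENGINE_ATTRIBUTE = '{"file": [{"uri": "s3://example-bucket/orders/"}],
                         "dialect": {"format": "parquet"}}'
""")

# Ask HeatWave to make the schema's lakehouse tables queryable (returns a load report).
cur.execute("CALL sys.heatwave_load(JSON_ARRAY('demo'), NULL)")

# Query the files at database speed, without copying them into MySQL storage first.
cur.execute("SELECT region, SUM(amount) FROM orders_lake GROUP BY region")
for region, total in cur.fetchall():
    print(region, total)
conn.close()
```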

Enhanced object storage support

New features in the lakehouse enable users to share query results and store them in object storage, as well as to use HeatWave for MapReduce applications. Changes to data in object storage are automatically detected, and updates are propagated to the lakehouse.

New native JavaScript support lets stored procedures and functions be written and executed in JavaScript, for uses such as querying data in object storage. Autopilot indexing enables AWS users to predict the optimal set of indexes needed for their transaction processing workloads.
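As a sketch of what that looks like in practice, the example below creates and calls a JavaScript-language function from Python through mysql-connector-python. The LANGUAGE JAVASCRIPT clause and $$ delimiters follow the form MySQL has published for JavaScript stored programs, but treat the exact DDL as an assumption to check against current HeatWave documentation.

```python
# Illustrative sketch of a JavaScript stored function in MySQL/HeatWave, created
# and invoked from Python. The LANGUAGE JAVASCRIPT DDL follows MySQL's published
# JavaScript stored-program form; verify details against the HeatWave docs.
import mysql.connector

conn = mysql.connector.connect(host="heatwave-host", user="admin",
                               password="...", database="demo")
cur = conn.cursor()

cur.execute("""
    CREATE FUNCTION normalize_region(name VARCHAR(64))
    RETURNS VARCHAR(64) LANGUAGE JAVASCRIPT AS $$
        // Trim whitespace and upper-case the region code inside the database.
        return name.trim().toUpperCase();
    $$
""")

cur.execute("SELECT normalize_region('  emea ')")
print(cur.fetchone()[0])  # expected: 'EMEA'
conn.close()
```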

New generative AI features include support for documents in 27 languages within the vector store. Optical character recognition can be used to conduct similarity searches using scanned documents. LLM inference batch processing improves application throughput by executing multiple requests simultaneously across a cluster. Changes to documents in object storage automatically trigger updates to corresponding vector embeddings.

AutoML, a no-cost feature that Oracle said includes everything needed to build, train and explain machine learning models, is gaining support for training data sets with four times the capacity of the previous version. Topic modeling can be used to discover insights in large textual data sets by interpreting themes in documents. Data drift controls help detect differences between the data used for training and new data. Semi-supervised log anomaly detection lets users provide feedback on the results of unsupervised anomaly detection and uses the labeled data to improve subsequent predictions.
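Oracle hasn’t detailed how HeatWave AutoML implements drift detection. As a generic illustration of the concept, the sketch below compares a feature’s training-time distribution against newly arriving data with a two-sample Kolmogorov-Smirnov test from SciPy; it is not HeatWave’s implementation.

```python
# Generic data-drift check: compare a feature's training distribution with the
# distribution seen in new data. This illustrates the concept only; it is not
# HeatWave AutoML's drift-detection implementation.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
train_feature = rng.normal(loc=100.0, scale=15.0, size=5_000)  # data used at training time
live_feature = rng.normal(loc=112.0, scale=15.0, size=5_000)   # newly arriving data (shifted mean)

stat, p_value = ks_2samp(train_feature, live_feature)
if p_value < 0.01:
    print(f"Drift detected (KS statistic={stat:.3f}, p={p_value:.2e}); consider retraining.")
else:
    print("No significant drift detected.")
```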

Oracle also said HeatWave is now available in the OCI Always Free Service, which can be used to build and run small-scale applications. All OCI accounts have unlimited access to a standalone HeatWave instance in their home region with 50 gigabytes of storage and 50 gigabytes of backup capacity.

Data lake for data intelligence

Oracle also said it plans to make its Intelligent Data Lake a foundational component of its Data Intelligence Platform beginning in 2025.

The Intelligent Data Lake is a new platform for storing, managing and analyzing large volumes of structured and unstructured data. It provides unified data management, AI-powered data discovery and cataloging capabilities, built-in governance tools and scalable storage.

The Data Intelligence Platform is integrated with Oracle’s Autonomous Data Warehouse, Analytics Cloud, HeatWave and various third-party services. The addition of Intelligent Data Lake will provide a unified developer experience, including data cataloging capability, Apache Spark and Apache Flink compatibility and Jupyter notebook support for data analysis and visualization, Oracle said. The combined services will enable customers to build a full-scale data lake, connect and extend analytical applications with real-time data from any source, inventory assets, transform data and fully orchestrate data with unified governance and security. The data lake will also offer zero copy, a technique that transfers data between different parts of a system without making multiple copies.
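Zero copy is a general technique rather than anything unique to Oracle’s stack: consumers share a reference to the same underlying data instead of duplicating it. The tiny Python sketch below illustrates the principle with a memoryview over a shared buffer; how the Intelligent Data Lake implements zero-copy sharing isn’t described in the announcement.

```python
# Generic illustration of the zero-copy principle: two views over the same
# buffer share bytes instead of duplicating them. This demonstrates the idea
# only; it says nothing about how Oracle's data lake implements it.
data = bytearray(b"order-42,EMEA,199.00")

copy_view = bytes(data)            # a real copy: new memory is allocated
zero_copy_view = memoryview(data)  # zero copy: a window onto the same memory

data[0:8] = b"order-43"            # mutate the underlying buffer once

print(bytes(zero_copy_view[:8]))   # prints b'order-43': the view sees the change
print(copy_view[:8])               # prints b'order-42': the copy is now stale
```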

In addition, Oracle announced the availability of the Oracle Analytics Cloud AI Assistant, which translates natural language into actions. The Analytics Cloud AI Assistant understands the context of a user’s question using a built-in LLM optimized for analytics conversations and tasks that recognizes the user’s Oracle Analytics workbooks and data sets. Customers can optionally use their own LLMs.

Oracle is mostly playing catch-up with other cloud providers in enabling multi-LLM support and providing dedicated AI development tools, said Sid Nag, research vice president at Gartner Inc. “I didn’t see any massively revolutionary things” in the first day’s announcements, he said. “They have created a mechanism for an API layer underneath other LLMs. Google has the same approach with Vertex AI, Gemini and third parties. Oracle was partnered with Cohere [Inc.] and now they’ve added Llama and Hugging Face [Inc.] to the mix.”

A partnership announced with Amazon Web Services Inc. on Monday closes the loop on Oracle’s strategy to make its database and cloud infrastructure stack run on all major public cloud platforms. “They’re now able to bring Oracle capabilities like Exadata [processing infrastructure] close to AWS users, so if they want to use an Oracle Autonomous Database and need massive amounts of power, they can have that at their disposal,” Nag said.

AI-focused development

Oracle is also announcing generative development, or GenDev, an AI-centric application development infrastructure. It’s intended to provide a set of development technologies for building applications that use AI-powered natural language interfaces and human-centric data. GenDev combines technologies in Oracle Database 23ai, including JavaScript Object Notation Relational Duality Views, AI vector search and Oracle’s Apex low-code development platform.

JSON Relational Duality Views are a hybrid data management approach that bridges relational databases with JSON document formats. They allow developers to work with structured relational data and semistructured JSON data, with flexible access, manipulation and storage.
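As a rough sketch of the idea, the example below creates a duality view over two hypothetical tables, departments and employees, and reads a department back as a JSON document using the python-oracledb driver. The table names and the exact DDL are assumptions for illustration, not copied from Oracle’s documentation.

```python
# Illustrative sketch of a JSON Relational Duality View in Oracle Database 23ai,
# accessed via python-oracledb. Table names and the exact DDL syntax are
# assumptions for illustration; consult the 23ai documentation for specifics.
import oracledb

conn = oracledb.connect(user="dev", password="...", dsn="dbhost/freepdb1")
cur = conn.cursor()

# One duality view exposes rows from DEPARTMENTS and EMPLOYEES as a single
# updatable JSON document per department.
cur.execute("""
    CREATE OR REPLACE JSON RELATIONAL DUALITY VIEW department_dv AS
    SELECT JSON {
        '_id'            : d.department_id,
        'departmentName' : d.department_name,
        'employees'      : [ SELECT JSON { 'employeeId' : e.employee_id,
                                           'lastName'   : e.last_name }
                             FROM employees e WITH INSERT UPDATE DELETE
                             WHERE e.department_id = d.department_id ]
    }
    FROM departments d WITH INSERT UPDATE DELETE
""")

# Applications can then read (and write) the same rows as JSON documents.
cur.execute("SELECT data FROM department_dv WHERE JSON_VALUE(data, '$._id') = :id",
            id=10)
print(cur.fetchone())
conn.close()
```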

AI Vector Search uses machine learning models to represent data — such as text, images and audio recordings — as multidimensional vectors. This allows for more nuanced searching capabilities compared to traditional keyword-based search methods, which are typically limited to exact pattern matches.
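A minimal example of the underlying idea: once items are represented as vectors, search becomes finding nearest neighbors by a similarity measure rather than matching keywords. The snippet below uses plain NumPy with made-up embeddings; it illustrates the concept, not Oracle’s AI Vector Search API.

```python
# Minimal nearest-neighbor search over embedding vectors with NumPy. The
# embeddings are made up; this shows the concept behind vector search, not
# Oracle's AI Vector Search API.
import numpy as np

# Pretend these rows are embeddings of three documents.
docs = ["refund policy", "shipping times", "warranty claims"]
doc_vectors = np.array([[0.9, 0.1, 0.1],
                        [0.1, 0.9, 0.2],
                        [0.2, 0.1, 0.9]])
doc_vectors = doc_vectors / np.linalg.norm(doc_vectors, axis=1, keepdims=True)

# Embedding of the query "how long until my package arrives" (also made up).
query = np.array([0.2, 0.95, 0.1])
query = query / np.linalg.norm(query)

# Cosine similarity is the dot product of unit vectors; the highest score wins.
scores = doc_vectors @ query
best = int(np.argmax(scores))
print(docs[best], round(float(scores[best]), 3))  # expected: 'shipping times'
```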

Oracle said the combination helps developers build AI applications more quickly and at lower risk. Data complexity is handled at the data layer, and application data rules covering such areas as intent, confidentiality, validation and integrity are enforced by the data engine. Oracle Database 23ai supports all data types and workloads without sacrificing data consistency, performance and availability. Users can interact with data and applications using natural language and find data based on its semantic content.

Other features of Database 23ai that assist in AI development include generating precise responses to natural language questions by combining large language models with enterprise data, an approach intended to reduce the risk of hallucinations. The database features built-in integration with 35 different LLMs across seven providers.

Users can access Nvidia Corp. graphics processing units without needing to provision or manage GPU servers. Oracle also provides a set of machine learning notebooks that use GPU-enabled Python packages for resource-intensive workloads, such as generating vector embeddings using transformer models and building deep learning models.
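As a generic illustration of that kind of workload, the sketch below generates embeddings with the open-source sentence-transformers library, falling back to the CPU when no GPU is present. The library and model name are illustrative choices, not packages Oracle has said its notebooks bundle.

```python
# Example of generating vector embeddings with a transformer model, the kind of
# GPU-friendly workload described above. The sentence-transformers library and
# the model name are illustrative choices, not Oracle's bundled packages.
import torch
from sentence_transformers import SentenceTransformer

device = "cuda" if torch.cuda.is_available() else "cpu"  # use a GPU if one is available
model = SentenceTransformer("all-MiniLM-L6-v2", device=device)

texts = ["HeatWave is a managed MySQL service.",
         "Autonomous Database runs on Exadata infrastructure."]
embeddings = model.encode(texts, normalize_embeddings=True)

print(embeddings.shape)  # one 384-dimensional vector per input text
```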

Users can prepare and load data using natural language and a visual drag-and-drop tool. They can also build operational property graph models without code, using a built-in self-service tool. For developers working with the Autonomous Database in the cloud, these features are available at a flat rate starting at 39 cents per hour. Autonomous Database for Developers is also available as a downloadable container image.
