

Among the roughly 60 announcements Microsoft Corp. is making at its Build conference today are new artificial intelligence capabilities across its cloud-based database management products.
Fabric, the company’s unified data platform introduced last year, is a major beneficiary. A Workload Development Kit currently in preview can be used to extend applications within Fabric. Fabric Data Sharing is a new feature that allows data to be shared in real time across users and applications; it includes an application programming interface for accessing data stored in external sources. A new Automation feature streamlines repetitive tasks.
A new RESTful GraphQL API lets Fabric developers access data from multiple sources with a single query. Expanded user data functions enable developers to build data-centric applications on Fabric lakehouses, data warehouses and mirrored databases using native code capabilities and custom logic, with simple integration.
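A single query spanning multiple sources might look like the following sketch. The endpoint URL, type names and fields here are illustrative assumptions, not the actual Fabric GraphQL schema; only the general GraphQL-over-HTTP request shape is standard.

```python
# Hypothetical sketch: the endpoint placeholder, type names (customers,
# orders) and fields below are assumptions, not the real Fabric schema.
import json
import urllib.request

FABRIC_GRAPHQL_ENDPOINT = "https://example.invalid/fabric/graphql"  # placeholder

# One query that pulls from two backing sources at once, e.g. a
# lakehouse-backed type and a warehouse-backed type.
QUERY = """
query {
  customers(first: 5) {   # hypothetical lakehouse-backed type
    items { customerId name }
  }
  orders(first: 5) {      # hypothetical warehouse-backed type
    items { orderId total }
  }
}
"""

def build_request(token: str) -> urllib.request.Request:
    """Package the query as a standard GraphQL POST request."""
    body = json.dumps({"query": QUERY}).encode("utf-8")
    return urllib.request.Request(
        FABRIC_GRAPHQL_ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
```

The point of the design is on the client side: one round trip returns data that previously required separate queries against each source.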
AI skills in Fabric add generative AI capabilities, allowing even nontechnical users to build applications that answer questions in natural language, Microsoft said. The company is also adding the Azure OpenAI Service at every layer to create data flows and pipelines, generate code and build machine learning models.
A new Real-Time Intelligence feature in Fabric is a software-as-a-service application that creates a single place to ingest, process and route events from diverse sources. Event streams can be processed using preconfigured streaming connectors to cloud sources with content-based routing.
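Content-based routing, the pattern Real-Time Intelligence applies to incoming streams, means each event's destination is decided by inspecting its payload rather than its origin. A minimal sketch, with made-up routes and field names:

```python
# Minimal content-based routing sketch. The routes, destinations and
# event fields are illustrative assumptions, not Fabric configuration.
from typing import Callable

Route = tuple[Callable[[dict], bool], str]

ROUTES: list[Route] = [
    # (predicate over event content, destination stream)
    (lambda e: e.get("severity") == "critical", "alerts"),
    (lambda e: e.get("source") == "iot-sensor", "telemetry"),
]
DEFAULT_DESTINATION = "archive"

def route_event(event: dict) -> str:
    """Return the destination for an event based on its content,
    checking routes in order and falling back to a default."""
    for predicate, destination in ROUTES:
        if predicate(event):
            return destination
    return DEFAULT_DESTINATION
```

In Fabric the predicates and destinations would be configured declaratively on an event stream rather than written in code, but the selection logic is the same.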
A Copilot for Fabric, now in private preview, can generate queries that surface conditions in high-volume data at a scale beyond what humans can detect.
Microsoft Azure Database for PostgreSQL is also getting AI capabilities leveraging either the Azure OpenAI Service or in-database models for those who want to keep their data within the database instance.
The Azure AI extension lets developers leverage large language models from Azure AI in their PostgreSQL applications. They can call the Azure OpenAI Service to generate LLM-based vector embeddings that allow efficient similarity searches, and call Azure AI Language for scenarios such as sentiment analysis, language detection and entity recognition.
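From SQL, this typically means calling an extension function to embed the query text and ordering rows by vector distance. The sketch below builds such a statement; the table, column and deployment names are assumptions, and the exact signature of the extension's embedding function should be checked against current documentation.

```python
# Sketch of an embedding-based similarity search statement for Azure
# Database for PostgreSQL with the azure_ai extension. Table name,
# column names and deployment name are hypothetical; verify the
# azure_openai.create_embeddings signature against the docs. The "<=>"
# operator is pgvector's cosine-distance operator.

def embedding_search_sql(deployment: str, query_text: str, top_k: int = 5) -> str:
    """Build a statement that embeds the query text via Azure OpenAI,
    then returns the rows whose stored embeddings are closest to it.
    String interpolation is for illustration only; real code should
    use parameterized queries."""
    return f"""
        SELECT id, content
        FROM documents
        ORDER BY embedding <=> azure_openai.create_embeddings(
            '{deployment}', '{query_text}')::vector
        LIMIT {top_k};
    """
```

The appeal of this pattern is that embedding generation and similarity ranking both happen in one SQL statement, with no separate application-side round trip to the model.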
Developers can also invoke pretrained machine learning models for scenarios such as fraud detection and product recommendations. Real-time text translation is available using Azure AI Translator.
In-database embedding generation capability supports text embedding models within Azure Database for PostgreSQL to generate embeddings within the database without calling into the Azure OpenAI Service. That reduces embedding creation time to single-digit millisecond latency with more predictable costs, Microsoft said.
Azure Cosmos DB, a globally distributed, multi-model database service for building large applications, is getting several AI-related updates. Cosmos DB for NoSQL now has built-in vector indexing and vector similarity search, allowing data and vectors to stay in sync without requiring a separate vector database. The feature is powered by DiskANN, an open-source set of approximate nearest-neighbor search algorithms, and is now in preview.
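DiskANN builds a graph index so the nearest neighbors can be found approximately without scanning every vector. The brute-force sketch below, with made-up data, just illustrates what "vector similarity search" computes: ranking stored vectors by cosine similarity to a query vector.

```python
# Exact brute-force similarity search for illustration only; DiskANN
# achieves the same ranking approximately at much larger scale via a
# graph index. Vectors and names here are made up.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: 1.0 means identical
    direction, 0.0 means orthogonal."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def nearest(query: list[float], items: dict[str, list[float]], top_k: int = 2) -> list[str]:
    """Return the names of the top_k stored vectors most similar to the query."""
    scored = sorted(items.items(),
                    key=lambda kv: cosine_similarity(query, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:top_k]]
```

Because the vectors live alongside the documents they describe, an update to a document and to its embedding can be kept in sync in one store, which is the point of the built-in feature.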
A new feature now in preview lets users transition their serverless Azure Cosmos DB accounts to provisioned capacity mode via the Azure portal or command line interface while retaining full access to data operations.
A new option now in preview lets Cosmos DB for MongoDB users create a continuously updated cluster replica in another region for failover purposes. A new Go software development kit enables operations on databases, containers and items across multiple regions for high-availability applications.