UPDATED 11:00 EDT / NOVEMBER 15 2023

AI

Microsoft turbocharges AI development with Fabric, Azure AI Studio and database updates

Microsoft Corp. is raising its game for artificial intelligence developers with a host of updates to its Azure cloud computing platform.

At its annual Ignite 2023 conference, the company explained that the success of any AI initiative is limited by the data that’s available to it, so one of the main focuses is on the integration of data and AI. To that end, the company announced the general availability of a new offering called Microsoft Fabric in the Microsoft Intelligent Data Platform, bringing together various databases, analytics and governance services to ensure companies can integrate their most vital data assets with their AI operations.


The company also announced the launch of Azure AI Studio, which brings innovations such as enhanced AI search capabilities and content safety.

Microsoft Fabric now available

In a blog post, Jessica Hawk, Azure’s corporate vice president of data, AI, digital applications and marketing, explained that data is the foundation for every AI application. As such, companies need a comprehensive data estate to fuel AI innovation, but this can be challenging for them to build in an era of fragmented data systems.

Microsoft Fabric, announced in May during the company’s annual Build 2023 developer conference, brings together all of the data and analytics tools needed to build cutting-edge AI applications and systems, according to the company. It integrates platforms including Data Factory, Synapse and Power BI into a single software-as-a-service product, replacing those disparate systems with what it says is a simple, easy-to-manage and cost-effective offering for building and integrating AI.

Microsoft says Fabric covers everything data professionals require, with tools for data integration, data engineering, data warehousing, data science, real-time analytics, applied observability and business intelligence. Microsoft already offers a comprehensive AI development platform, Azure OpenAI, for building all manner of next-generation AI experiences, but building these applications requires a steady stream of clean data and integrated analytics.

With Fabric, developers can now access a complete data platform, as opposed to a complex labyrinth of disconnected tools, the company said. It provides all of the capabilities needed to extract insights from data and make it available to AI systems, and soon it will integrate its own Copilot tool in preview to enable developers to interact with it using natural language commands.

Microsoft Fabric is built atop an open data lake platform, called OneLake, which provides a single source of truth and eliminates the need to extract, move or replicate data. Through OneLake, Microsoft said, Fabric also enables persistent data governance and a single capacity pricing model that scales up as usage grows, while its open nature removes the risk of proprietary lock-in. Other capabilities include native integration with Microsoft 365 apps such as Excel, Dynamics 365 and Microsoft Teams.

Azure AI Studio

Alongside Fabric, Microsoft unveiled a public preview of Azure AI Studio for AI developers, providing everything needed to build with generative AI models in one place. It facilitates access to the latest large language models; data integration to power retrieval-augmented generation, a technique that lets AI systems leverage a company’s private datasets; intelligent search; full-lifecycle model management; and content safety tools.

Azure AI Studio’s most important feature is Prompt Flow, a tool for managing prompt orchestration and LLMOps. Also available in the Azure Machine Learning service, Prompt Flow simplifies the process of prototyping, experimenting, iterating and deploying AI applications, according to Microsoft.

Azure AI Search, formerly known as Cognitive Search, is also available in Azure AI Studio. It enables effective data retrieval to reduce latency and powers a technique called retrieval-augmented generation, or RAG, which allows LLMs to integrate data from additional sources and improve the quality of their responses.

RAG is essential for certain kinds of applications, such as customer service agents, that must generate reliable answers based on accurate information. The technique is powered by vector search, a method of finding information by semantic similarity rather than exact keyword matches, across data types including image, audio, text and video. That makes vector search an essential capability for ensuring AI has access to more comprehensive data.
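The retrieval step behind RAG can be sketched in a few lines of plain Python. This is a minimal illustration, not Azure code: the bag-of-words embedding below is a stand-in for a real embedding model, and in practice a service such as Azure AI Search would handle indexing and ranking.

```python
import math
from collections import Counter


def embed(text: str, vocab: list[str]) -> list[float]:
    # Toy bag-of-words embedding over a fixed vocabulary; a real system
    # would call an embedding model here. Words outside the vocabulary
    # are simply ignored.
    counts = Counter(text.lower().split())
    vec = [float(counts[w]) for w in vocab]
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def cosine(a: list[float], b: list[float]) -> float:
    # Vectors from embed() are unit-length, so the dot product is the
    # cosine similarity.
    return sum(x * y for x, y in zip(a, b))


def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Vector search: rank documents by similarity to the query embedding.
    vocab = sorted({w for d in docs for w in d.lower().split()})
    q = embed(query, vocab)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d, vocab)), reverse=True)
    return ranked[:k]


def build_prompt(query: str, docs: list[str]) -> str:
    # The "augmented generation" half of RAG: stitch the retrieved
    # passages into the prompt sent to the LLM.
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

The same shape scales up directly: swap the toy embedding for a hosted model and the in-memory ranking for a vector index, and the prompt-assembly step stays the same.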

Azure AI Search also boasts a capability called semantic ranker, which provides access to the same search re-ranking technology that powers Microsoft Bing’s search results, ranking responses based on their relevance.

Azure AI Content Safety, meanwhile, provides developers with tools to evaluate model responses, ensure their accuracy and mitigate bias and other risks. Its main features are the Responsible AI Dashboard, model monitoring and a commitment by Microsoft to defend and indemnify Azure customers from lawsuits alleging copyright infringement.

The Customer Copyright Commitment is available to all customers using the Azure OpenAI Service, and is a testament to Microsoft’s confidence in its abilities to safeguard AI development. “By extending the CCC to Azure OpenAI Service, Microsoft is broadening our commitment to defend our commercial customers and pay for any adverse judgments if they are sued for copyright infringement for using Azure OpenAI Service or the output it generates,” Hawk said.

AI updates everywhere

Those announcements are extensive, but Microsoft has plenty more to offer AI developers. For instance, it announced the coming availability of GPT-4 Turbo with Vision in Azure OpenAI Service and Azure AI Studio. It’s the most powerful LLM yet offered by Microsoft’s partner OpenAI LP, giving developers a way to “unlock multi-modal capabilities” in their AI applications. That means applications that can use object detection to see, understand and make inferences about what they see, whether in the real world, in videos or elsewhere.

The Azure Cosmos DB database service is getting a host of enhancements too, as the company bids to make that platform the database of choice for AI developers. It now supports dynamic scaling, the flexibility to scale databases up or down based on demand, ensuring customers always use the right amount of cloud resources and keep their cloud costs in check.

Other new features include the general availability of Azure Cosmos DB for MongoDB vCore and vector search in Azure Cosmos DB for MongoDB vCore. With this, developers can now build intelligent applications with full support for MongoDB data that’s stored in Azure Cosmos DB. It means MongoDB developers will benefit from native Azure integrations and a lower total cost of ownership while enjoying the familiar experience of that database service.

As for vector search, it makes it possible to use Azure Cosmos DB to store unstructured data as embeddings, eliminating the need to transfer data out of the platform to a separate, specialized vector database.
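Conceptually, that means a record’s embedding lives alongside its operational fields and similarity queries run in the same store. Here is a minimal in-memory sketch of that pattern, with illustrative field names and hand-picked three-dimensional embeddings standing in for real model output:

```python
import math

# Records stored with an "embedding" field next to the operational data,
# the pattern that vector search in Azure Cosmos DB for MongoDB vCore
# enables. The in-memory list stands in for the database.
products = [
    {"id": "p1", "name": "running shoes", "embedding": [0.9, 0.1, 0.0]},
    {"id": "p2", "name": "coffee maker", "embedding": [0.0, 0.2, 0.9]},
]


def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / ((na * nb) or 1.0)


def nearest(query_embedding: list[float], items: list[dict], k: int = 1) -> list[dict]:
    # Return the k records whose stored embeddings are most similar
    # to the query embedding.
    ranked = sorted(items, key=lambda d: cosine(query_embedding, d["embedding"]), reverse=True)
    return ranked[:k]
```

Because the embedding is just another field on the record, there is no separate pipeline keeping a vector store in sync with the source data.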

Additionally, Microsoft is enhancing the Azure Kubernetes Service, a managed version of the open-source Kubernetes tool that’s used to orchestrate software containers, which host the components of AI applications.

The company said AKS now supports specialized machine learning workloads such as LLMs with minimal configuration required to get them up and running. The new Kubernetes AI toolchain operator automates LLM deployment on AKS across the available central processing units and graphics processing units.

It works by selecting the optimal infrastructure for each model from the choices available. In this way, developers can split inferencing costs across multiple lower-GPU-count virtual machines. Not only does this lower costs, but it also increases the number of regions where LLM workloads can run, while eliminating wait times for higher-GPU-count VMs. Customers can also run preset models hosted on AKS, reducing the overall inference service setup time, the company said.

Also in AKS, there’s a new Azure Kubernetes Fleet Manager that can be used to manage workload distribution across Kubernetes clusters and facilitate platform and application updates, so developers will know they are running the latest and most secure software.

Finally, Microsoft announced the latest version of its .NET computing framework for Windows, Linux and macOS operating systems. It says the new version, .NET 8, delivers a leap forward in terms of performance and productivity for ASP.NET and .NET developers, laying the groundwork for a new generation of intelligent cloud-native applications that can be tightly integrated with Azure.

Its new features include support for Azure Functions, Azure App Service, AKS and Azure Container Apps, as well as updates to Visual Studio, its integrated development environment, and integrations with GitHub and Microsoft DevBox.

Images: Microsoft
