MinIO debuts object storage built for AI workloads
MinIO Inc., developer of a high-performance, Kubernetes-native object store compatible with Amazon Web Services Inc.’s S3 service, today announced a version of its object storage platform designed to support all types of artificial intelligence training data on a single infrastructure.
Object storage is a highly scalable method for storing various structured and unstructured data types distributed across multiple hardware devices. MinIO says that more than half of Fortune 500 companies use its service.
Citing research that found the top reasons organizations adopt object storage are to support AI initiatives and to deliver public cloud-like performance and scalability, MinIO said its new AIStor is specifically tuned for workloads that require understanding the characteristics of the data being stored.
The new service includes an S3 application programming interface called promptObject that lets users interact with unstructured objects the way they would converse with a large language model. PromptObject calls can be chained and combined with function calls to address multiple objects at once. For example, a user can ask about abnormalities in a stored MRI scan without going through an LLM. This lets developers expand the capabilities of applications without requiring domain-specific knowledge of retrieval-augmented generation models or vector databases, simplifying AI application development.
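To make the idea concrete, here is a minimal sketch of what a promptObject-style request might look like over HTTP. The endpoint, query parameter and request-body shape are assumptions for illustration only, and authentication is omitted; the actual API contract is defined in MinIO's AIStor documentation.

# Hypothetical sketch of a promptObject-style request against an AIStor
# deployment. Endpoint, query parameter and body shape are assumed for
# illustration; real requests would also carry AWS Signature V4 credentials.
import requests

AISTOR_ENDPOINT = "https://aistor.example.com"          # assumed endpoint
BUCKET, OBJECT_KEY = "radiology", "scans/patient-042/mri.dcm"

resp = requests.post(
    f"{AISTOR_ENDPOINT}/{BUCKET}/{OBJECT_KEY}?prompt",  # assumed query parameter
    json={"prompt": "Are there any visible abnormalities in this MRI scan?"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # natural-language answer about the stored object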
A private API compatible with Hugging Face Inc.’s repository of open-source AI models lets organizations create their own data and model repositories on a private cloud or in air-gapped environments without code changes, reducing the risk of data leakage.
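As a rough illustration of the "no code changes" claim: the open-source huggingface_hub client already honors the HF_ENDPOINT environment variable, so pointing it at a Hub-compatible private repository is a configuration change rather than a code change. The endpoint URL and repository name below are hypothetical.

# A minimal sketch assuming AIStor exposes a Hugging Face Hub-compatible API
# at a private URL (hypothetical). HF_ENDPOINT must be set before the client
# library is imported, because it reads the value at import time.
import os
os.environ["HF_ENDPOINT"] = "https://models.internal.example.com"  # assumed private endpoint

from huggingface_hub import snapshot_download

# Downloads the model snapshot from the private repository instead of huggingface.co.
local_dir = snapshot_download(repo_id="acme/llama-finetune")  # hypothetical repo id
print(local_dir)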
A redesigned console user interface for managing MinIO storage supports identity and access management, information lifecycle management, load balancing, firewall, security, caching and orchestration from a single pane of glass. The console also features a Kubernetes operator that further simplifies the management of large-scale data infrastructure across hundreds of servers and tens of thousands of drives.
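For a sense of what operator-driven management looks like in practice, the sketch below creates an object-store tenant through the Kubernetes API using the official Python client. The custom resource shown follows the open-source MinIO Operator's Tenant conventions (minio.min.io/v2); the AIStor operator's resources and field names may differ, so treat the spec as an illustrative assumption.

# Illustrative sketch: creating an object-store tenant via a Kubernetes
# operator using the official kubernetes Python client. The group, version,
# kind and spec fields follow the open-source MinIO Operator's Tenant CRD
# and are assumptions here; the AIStor operator may define different resources.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod
api = client.CustomObjectsApi()

tenant = {
    "apiVersion": "minio.min.io/v2",
    "kind": "Tenant",
    "metadata": {"name": "analytics", "namespace": "object-store"},
    "spec": {
        # One pool of 16 servers with 16 drives each; values are placeholders.
        "pools": [{"servers": 16, "volumesPerServer": 16}],
    },
}

api.create_namespaced_custom_object(
    group="minio.min.io", version="v2",
    namespace="object-store", plural="tenants", body=tenant,
)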
Support for S3 over remote direct memory access, or RDMA, exploits the low-latency, high-throughput capabilities of high-speed Ethernet networks to improve performance while reducing CPU usage.
MinIO’s survey of 656 information technology executives found that an average of 70% of their cloud-native storage is in object form today and that the average will grow to 75% within two years. The survey found that object stores are the primary foundation for advanced analytics, AI model training and data lakes/lakehouses. Over half of respondents plan to build a data lakehouse on an object storage foundation within the next 12 months, and 41% use or plan to use object stores to support AI workloads.
In a blog post, MinIO said the survey results should shock makers of storage-area network and network-attached storage systems, which are valued for their low cost but don’t provide the cloud-native features needed for AI development. “SAN/NAS technologies are ill-suited for the cloud-native world and you can’t containerize an appliance,” wrote Jonathan Symonds, the company’s chief marketing officer.
Image: SiliconANGLE/DALL-E