UPDATED 15:31 EDT / JULY 16 2025

AI

In big AI agent push, AWS debuts new development tools, vector-optimized object store

Amazon Web Services Inc. is rolling out a new set of tools designed to help customers build artificial intelligence agents.

Swami Sivasubramanian (pictured), the cloud giant’s vice president of agentic AI, detailed the offerings today at the AWS Summit in New York.

AI agent environments

The first new offering that Sivasubramanian detailed during his keynote is called Amazon Bedrock AgentCore. It comprises a half dozen services designed to ease the task of building and maintaining AI agents.

AgentCore’s first component, AgentCore Runtime, provides cloud-based sandboxes for hosting AI agents. It allows agents to operate for up to eight hours per run, which makes it possible to automate time-consuming tasks such as analyzing large datasets. Each AgentCore sandbox can be configured with different security settings tailored to the workload it hosts.
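For illustration, hosting an agent on the runtime is meant to require little more than wrapping an existing agent loop in an entrypoint. The sketch below is a minimal example based on AWS's launch materials; it assumes the bedrock-agentcore Python SDK exposes a BedrockAgentCoreApp class and an entrypoint decorator, and exact names and import paths may differ.

```python
# Minimal sketch of hosting an agent in AgentCore Runtime.
# Assumes the bedrock-agentcore Python SDK provides BedrockAgentCoreApp
# and an @entrypoint decorator as shown in AWS's launch materials;
# exact names and import paths may differ.
from bedrock_agentcore.runtime import BedrockAgentCoreApp

app = BedrockAgentCoreApp()

@app.entrypoint
def invoke(payload: dict) -> dict:
    # A real agent would run its reasoning loop here, for example calling
    # a Bedrock-hosted model and any tools the workflow requires.
    prompt = payload.get("prompt", "")
    return {"result": f"Received request: {prompt}"}

if __name__ == "__main__":
    # Starts a local server for testing; the same handler is what gets
    # packaged and deployed into a Runtime sandbox.
    app.run()
```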

If completing a task requires an agent to use an external system, developers can activate a service called AgentCore Gateway. It allows agents to access application programming interfaces, functions deployed on AWS Lambda and other external workloads. If some of those workloads require an agent to authenticate itself, a companion module called AgentCore Identity makes it possible to do so using access management services such as Okta.

A code interpreter built into AgentCore allows AI agents to run the code they generate. Another tool, a cloud-based browser, enables agents to perform tasks that require interacting with websites. Developers can check that their AgentCore workloads run reliably using a service called AgentCore Observability.

“AgentCore provides a secure, serverless runtime with complete session isolation and the longest running workload available today, tools and capabilities to help agents execute workflows with the right permissions and context, and controls to operate trustworthy agents,” Sivasubramanian wrote in a blog post.

Vector storage

AgentCore-powered agents and other AI applications can keep their data in Amazon S3 Vectors, a new storage offering that also debuted at AWS Summit today. It’s optimized to store vectors, the mathematical structures in which neural networks encode their data. AWS says the offering costs up to 90% less than alternative services.

S3 Vectors stores information in repositories called vector buckets. A vector bucket can hold up to 10,000 data structures called vector indexes. Each vector index, in turn, may contain tens of millions of vectors.

Customers can optionally enrich their records with metadata such as the date when a given vector was created. Such contextual information makes it easier for AI models to find relevant records in large datasets. According to AWS, S3 Vectors processes queries with sub-second latency. 
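In practice, the service is exposed through the AWS SDKs and command-line interface. The snippet below is a minimal sketch modeled on the parameter names in AWS's launch post; it assumes a boto3 "s3vectors" client with put_vectors and query_vectors operations, an index that already exists with a matching dimension, and placeholder bucket and index names, so details may differ.

```python
# Minimal sketch of writing and querying vectors in Amazon S3 Vectors.
# Assumes the boto3 "s3vectors" client and the parameter names shown in
# AWS's launch post (put_vectors / query_vectors); request shapes may differ.
import boto3

s3vectors = boto3.client("s3vectors", region_name="us-east-1")

# Store an embedding along with optional metadata such as a creation date.
# Bucket and index names are hypothetical; the index is assumed to exist
# with a dimension matching the toy three-element embedding below.
s3vectors.put_vectors(
    vectorBucketName="my-vector-bucket",
    indexName="product-descriptions",
    vectors=[{
        "key": "doc-001",
        "data": {"float32": [0.12, 0.56, 0.91]},
        "metadata": {"created": "2025-07-16", "category": "catalog"},
    }],
)

# Query the index for nearest neighbors, filtering on the metadata field.
response = s3vectors.query_vectors(
    vectorBucketName="my-vector-bucket",
    indexName="product-descriptions",
    queryVector={"float32": [0.10, 0.60, 0.88]},
    topK=5,
    filter={"category": "catalog"},
    returnMetadata=True,
    returnDistance=True,
)
print(response["vectors"])
```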

“As you write, update, and delete vectors over time, S3 Vectors automatically optimizes the vector data to achieve the best possible price-performance for vector storage, even as the datasets scale and evolve,” AWS principal developer advocate Channy Yun explained in a blog post.

S3 Vectors integrates with multiple AWS services including Amazon Bedrock, which offers access to a set of cloud-hosted foundation models. Some of the algorithms are developed by third-party providers such as Anthropic, while others are built by AWS itself. Companies can use the models to power their AI agents.

Going forward, the cloud giant will enable users to customize the Amazon Nova series of models that it offers through Bedrock. The series comprises more than a half dozen algorithms including several large language models. The other neural networks in the lineup, meanwhile, are geared towards tasks such as image generation.

AWS will enable customers to customize Nova models during both the pre- and post-training phases of the development workflow. The pre-training phase produces the base version of an AI model. Post-training, in turn, is the umbrella term for the optimizations that engineers make to a model after initial development is complete.

AWS will support several customization methods. One of them is reinforcement learning from human feedback, or RLHF, a widely used technique whereby humans rate the quality of an LLM’s prompt responses. That feedback helps the model refine its output. After customizing a model, customers can deploy it on Bedrock.
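As a rough illustration of the RLHF idea, a reward model is typically fit to human preference pairs with a Bradley-Terry-style loss before the language model itself is tuned against it. The tiny sketch below shows only that loss on toy scores; it is not AWS's actual training recipe.

```python
import math

def preference_loss(reward_chosen: float, reward_rejected: float) -> float:
    """Bradley-Terry style loss used when fitting a reward model to human
    preference data: it shrinks as the human-preferred response is scored
    increasingly higher than the rejected one."""
    return -math.log(1.0 / (1.0 + math.exp(-(reward_chosen - reward_rejected))))

# Toy example: the preferred answer scores 2.0, the rejected one 0.5.
print(round(preference_loss(2.0, 0.5), 4))  # small loss: ranking is correct
print(round(preference_loss(0.5, 2.0), 4))  # large loss: ranking is inverted
```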

“Customers can now customize Nova Micro, Nova Lite, and Nova Pro across the model training lifecycle, including pre-training, supervised fine-tuning, and alignment,” AWS senior developer advocate Betty Zheng detailed in a blog post.

New AI tools

AWS announced the new offerings alongside a number of other AI-related updates. The AWS Marketplace now has a section dedicated to AI agents, tools and related offerings from the cloud giant’s partners. Nova Act, a Bedrock model that can perform actions in a browser, is receiving an enhanced software development kit with expanded cybersecurity features.

AWS is also releasing two new MCP servers. The first offers access to data about its APIs, while the other contains knowledge from its developer documentation. AI agents can use the MCP servers to incorporate that information into their prompt responses.
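MCP, the Model Context Protocol, lets an agent discover and call tools that such servers expose over a standard interface. The snippet below is a generic sketch using the open-source MCP Python SDK; the server command is a placeholder for illustration, not the name of AWS's actual packages.

```python
# Generic sketch of an agent-side MCP client listing a server's tools.
# Uses the open-source MCP Python SDK; the server command below is a
# placeholder, not the name of AWS's actual MCP server packages.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    params = StdioServerParameters(
        command="uvx",
        args=["example-aws-docs-mcp-server"],  # hypothetical package name
    )
    # Launch the server as a subprocess and talk to it over stdio.
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```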

AWS will invest $100 million in its AWS Generative AI Innovation Center to help customers with their AI projects. The business unit, which was formed in 2023, provides access to AI researchers, engineers and other technical experts. Alongside the investment, AWS disclosed that the unit has completed AI projects for thousands of customers since launching.

Photo: AWS
