Docker debuts GenAI developer stack and its first AI coding assistant
Container development tooling provider Docker Inc. said today it is working with multiple partners to offer a new generative artificial intelligence development stack that lets developers start building AI applications in minutes.
Docker’s GenAI Stack was unveiled alongside the company’s first generative AI-powered coding assistant, which aims to boost developer productivity.
Announced at DockerCon today in Los Angeles, Docker’s new GenAI Stack was made possible through partnerships with Neo4j Inc., LangChain Inc. and the open-source project Ollama. It’s a preconfigured, ready-to-code and secure platform that provides access to large language models from Ollama, Neo4j’s vector and graph database, and the LangChain development framework.
Meanwhile, Docker provides the tooling required to build the applications powered by the new generative AI models. It offers a suite of popular tools for developing container-based apps. Containers are used to host the components of modular applications that can run on any platform. With Docker, developers can create and test containers on local machines and share code, and later deploy them on the infrastructure of their choice.
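That build-locally, ship-anywhere workflow can be illustrated with a minimal, hypothetical Dockerfile (the base image, file names and app are assumptions for illustration, not anything Docker announced):

```dockerfile
# Hypothetical Dockerfile for a small Python app (illustrative only)
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "app.py"]
```

A developer would typically build and test this locally with `docker build -t myapp .` and `docker run myapp`, then push the resulting image to a registry for deployment on whatever infrastructure they choose.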
Docker said the main advantage of its GenAI Stack is that it eliminates the need to search for and cobble together the technologies necessary to support generative AI models, which stand out for their ability to generate content and images, among other things.
Docker Chief Executive Scott Johnston said developers are excited about generative AI but are not clear how to get started because of the variety of technology stacks available. “Today’s announcement eliminates this dilemma by enabling developers to get started quickly and safely using the Docker tools, content, and services they already know and love together with partner technologies on the cutting edge of GenAI app development,” he said.
Docker said its GenAI Stack, which is available on GitHub starting today, provides developers with everything they need to get started building generative AI applications. It comes with preconfigured, open-source LLMs such as Llama 2, Code Llama and Mistral, as well as private models such as OpenAI LP’s GPT-3.5 and GPT-4. Ollama provides the support developers need to work with and customize those models, while Neo4j provides the default database with graph and vector search capabilities.
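As a rough sketch of how such a stack might be wired together with Docker’s own tooling, consider a Compose file like the one below. The service names and layout are assumptions for illustration, not Docker’s published configuration; the `ollama/ollama` and `neo4j` images and their default ports are real, though:

```yaml
# Hypothetical docker-compose.yml sketch of a GenAI stack (illustrative only)
services:
  llm:
    image: ollama/ollama        # serves local LLMs such as Llama 2
    ports:
      - "11434:11434"           # Ollama's default API port
  database:
    image: neo4j:5              # graph database with vector search support
    ports:
      - "7474:7474"             # Neo4j browser UI
      - "7687:7687"             # Bolt protocol
  app:
    build: .                    # the LangChain-based application itself
    depends_on:
      - llm
      - database
```

A single `docker compose up` would then bring up the model server, the database and the application together, which is the kind of "preconfigured, ready-to-code" experience the announcement describes.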
That’s important, because graphs and vector embeddings, which are numerical representations of unstructured data, are used to uncover patterns and relationships in data, helping AI find accurate answers more quickly. Meanwhile, LangChain’s AI development framework orchestrates the underlying LLM, the database and the application being built. As for Docker, it provides the supporting tools, code templates and everything else needed for app development.
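To make the vector-embedding idea concrete, here is a minimal, self-contained Python sketch. The toy three-dimensional vectors stand in for real embeddings, which would come from an embedding model, and this is plain cosine similarity rather than Neo4j’s actual vector-index API:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings": in practice an embedding model produces these,
# and a vector index stores them for fast similarity search.
documents = {
    "containers": [0.9, 0.1, 0.0],
    "databases":  [0.1, 0.9, 0.1],
    "shipping":   [0.8, 0.2, 0.1],
}

def nearest(query_vec, docs):
    """Return the document whose embedding is most similar to the query."""
    return max(docs, key=lambda name: cosine_similarity(query_vec, docs[name]))

query = [0.85, 0.15, 0.05]  # embedding of a query about containers
print(nearest(query, documents))  # -> "containers"
```

A vector database does the same nearest-neighbor lookup at scale, typically over millions of high-dimensional embeddings, so the LLM can be handed the most relevant data as context.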
Docker promised that its GenAI Stack makes life especially easy for developers, with straightforward setup, effortless data loading and vector index population, so it’s simple to prepare and format the data they want to use to customize the underlying LLMs.
RedMonk co-founder and Principal Analyst James Governor said that until last year, AI development was a specialist field. However, with the buzz around ChatGPT and other applications, AI has become something that every company and developer is interested in. The problem now, he said, is that many are confused by the fragmented landscape of generative AI developer tooling.
“Great packaging is going to be needed before general, broad-based adoption by developers for building AI-driven apps really takes off,” the analyst said. “The GenAI Stack that Docker, Neo4j, LangChain and Ollama are collaborating to offer provides the kind of consistent and unified experience that makes developers productive with new tools and methods.”
Generative AI paves the way to increased developer velocity
Docker isn’t just trying to make generative AI development easy, but regular application development too. In addition to the GenAI Stack, it announced a new product called Docker AI, which provides automated and context-specific guidance to developers as they work.
Available in early access now, Docker AI can be thought of as a code generation assistant, similar to tools such as GitHub’s Copilot and Tabnine. It’s designed to provide context-aware guidance to developers as they’re editing Dockerfiles or Docker Compose files, debugging applications or running local tests. Docker says its AI helps developers benefit from the collective wisdom of the millions of developers who use its tools every day.
Johnston said tools such as GitHub’s Copilot are mostly focused on writing application source code. “In addition to the source code, applications are made up of web servers, language runtimes, databases, message queues, and many other technologies,” he explained. “Docker AI helps developers define and troubleshoot all aspects of the app quickly and securely as they iterate in their ‘inner loop.’”