Docker, the widely used platform for building container-based applications, has partnered with Neo4j, LangChain, and Ollama to launch the GenAI Stack, a technology suite designed to help developers build applications and integrate generative AI into them with minimal friction.
The GenAI Stack combines Neo4j's graph and vector search capabilities with LangChain orchestration, setting it apart from code-generation tools such as GitHub Copilot or Amazon CodeWhisperer, which assist with writing code rather than assembling a full generative AI application.
Through this alliance, Docker aims to speed up AI development and ground large language models (LLMs) in a Neo4j knowledge graph so they produce more accurate responses. LangChain orchestration connects the database and its vector index to the LLM and the application, letting developers build context-aware reasoning applications powered by the LLMs.
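To make the grounding pattern concrete, here is a minimal, library-free Python sketch of the retrieval step the stack orchestrates: embed documents, find the closest match to a question via vector similarity, and splice that context into the LLM prompt. The bag-of-words "embedding" and helper names are illustrative assumptions, not the stack's actual API; the real stack uses Neo4j's vector index and model-generated embeddings.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words vector; the real stack uses model embeddings
    # stored in a Neo4j vector index.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Ground the LLM by placing retrieved context ahead of the question.
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Neo4j stores data as a graph of nodes and relationships.",
    "Ollama runs open-source LLMs locally.",
]
prompt = build_prompt("How does Neo4j store data?", docs)
print(prompt)
```

In the actual stack, `build_prompt`'s output would be sent to a locally running model via Ollama, with LangChain handling the retrieval and prompt assembly.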
The partnership with Ollama, meanwhile, helps developers run open-source LLMs locally. The stack comes bundled with preconfigured open-source LLMs such as Llama 2, Code Llama, and Mistral, along with support tools, illustrative code templates, knowledge base articles, and best practices for generative AI.
The GenAI Stack is available now in the Learning Center in Docker Desktop and on GitHub. It complements platforms like AppMaster that make application development more accessible.
Docker has also opened an early-access program for Docker AI, its generative AI assistant. According to Docker CEO Scott Johnston, unlike other code-generating assistants, Docker AI is designed to help developers define and troubleshoot all aspects of an application, making it useful for both building and maintaining applications.