Neo4j and AWS Collaborate to Drive Accurate, Transparent, and Explainable GenAI


Neo4j, one of the world’s leading graph database and analytics companies, has unveiled a multi-year Strategic Collaboration Agreement (SCA) with AWS aimed at helping enterprises improve generative AI (GenAI) results with greater accuracy, transparency, and explainability. By combining knowledge graphs, native vector search, and AWS integrations, Neo4j and AWS are accelerating successful GenAI outcomes, with fewer hallucinations, while tackling the technology's present challenges.

Building on Neo4j’s existing graph database with native vector search, which enables AI to reason, infer, and retrieve relevant information effectively, the partnership targets a common GenAI gap: the lack of long-term memory for domain-specific large language models (LLMs).

These LLMs can further complicate AI initiatives by delivering incomplete, inaccurate, or unverifiable information; Neo4j’s graph database and knowledge graph capabilities ground them in enterprise data while simultaneously serving as a long-term memory solution.

Notably, the partnership integrates Neo4j’s knowledge graphs and vector search with Amazon Bedrock, a fully managed foundation model service that makes models from leading AI companies more accessible. This integration offers a wide variety of benefits to enhance AI initiatives, leveraging both Neo4j’s robust graph technology and AWS’ AI expertise.

“The possibilities are limitless when you combine the unparalleled creative power and language proficiency inherent in large language models with the data-intensive capabilities of knowledge graphs enriched with vector embeddings,” said Sudhir Hasbe, CPO, Neo4j. “It’s a harmonious blend similar to that of unlocking the potential of both the left and right brains.”

Neo4j with LangChain and Amazon Bedrock reduces AI hallucinations—a common and detrimental problem for every GenAI project—by using Retrieval Augmented Generation (RAG). This enables the creation of virtual assistants grounded in enterprise knowledge, ultimately empowering more accurate, transparent, and explainable AI results. 

“When developers want to reduce hallucinations and ground LLMs with a knowledge base that includes both connected data in a graph and vector embeddings, they need Neo4j,” said Hasbe. “They can simply plug in Amazon Bedrock and Neo4j along with…a chatbot application in a LangChain workflow, thereby providing both the foundational models and the grounding vector and knowledge graph.”
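To make that workflow concrete, the following minimal sketch shows how such a retrieval-augmented chatbot could be wired together in LangChain with Bedrock models and a Neo4j vector index. The connection details, model IDs, index name, and exact class names are illustrative assumptions rather than part of the announcement, and may differ across library versions.

```python
# A minimal RAG sketch combining Amazon Bedrock and Neo4j in a LangChain workflow.
# Connection details, model IDs, and the index name below are placeholders.
from langchain_aws import ChatBedrock, BedrockEmbeddings
from langchain_community.vectorstores import Neo4jVector
from langchain.chains import RetrievalQA

# Embedding and chat models served through Amazon Bedrock
embeddings = BedrockEmbeddings(model_id="amazon.titan-embed-text-v1")
llm = ChatBedrock(model_id="anthropic.claude-3-sonnet-20240229-v1:0")

# Vector index backed by the Neo4j graph (assumes documents were already embedded and indexed)
vector_store = Neo4jVector.from_existing_index(
    embedding=embeddings,
    url="neo4j+s://<your-aura-instance>.databases.neo4j.io",
    username="neo4j",
    password="<password>",
    index_name="enterprise_docs",
)

# Retrieval-augmented QA chain: answers are grounded in the content retrieved from the graph
qa_chain = RetrievalQA.from_chain_type(
    llm=llm,
    retriever=vector_store.as_retriever(search_kwargs={"k": 4}),
)
print(qa_chain.invoke({"query": "What does our internal policy say about data retention?"}))
```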

The SCA also advances personalization in GenAI development, with Neo4j’s highly contextual knowledge graphs integrating with Amazon Bedrock’s ecosystem of foundation models to deliver personalized text generation and summarization. Using Bedrock’s API for text completion, chatbot, and text generation/summarization use cases, developers can customize these requests with unique prompts.
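As a rough, illustrative sketch only, a customized text-generation request made through the AWS SDK for Python (boto3) might look like the following; the model ID and request body are assumptions, since each foundation model on Bedrock defines its own request schema.

```python
# Illustrative only: invoke a Bedrock foundation model with a custom summarization prompt.
# The model ID and request body format are placeholders; each model defines its own schema.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

prompt = "Summarize the following customer interactions in three bullet points:\n..."
response = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 300,
        "messages": [{"role": "user", "content": prompt}],
    }),
)
print(json.loads(response["body"].read()))
```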

Neo4j’s Amazon Bedrock integration delivers complete answers during real-time search, leveraging Neo4j’s new vector search and storage capability. Bedrock can be used to produce vector embeddings for unstructured data, which are then stored as properties in the same database and knowledge graph. Semantic searches performed with LLMs then query these stored vectors for efficient and accurate results.
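A hedged sketch of that pattern, with placeholder credentials, model IDs, and index names, might look like this: embeddings produced by Bedrock are written onto graph nodes as properties, and a vector index query retrieves the most similar nodes. The vector index procedure shown assumes a recent Neo4j 5.x release.

```python
# Sketch: store a Bedrock-generated embedding as a node property in Neo4j,
# then run a semantic search against a vector index. Names and credentials are placeholders.
import json
import boto3
from neo4j import GraphDatabase

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
driver = GraphDatabase.driver(
    "neo4j+s://<instance>.databases.neo4j.io", auth=("neo4j", "<password>")
)

def embed(text: str) -> list[float]:
    # Titan text embeddings; other Bedrock embedding models use different request bodies.
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(response["body"].read())["embedding"]

# Store the text and its embedding on the same node that lives in the knowledge graph.
text = "Acme Corp acquired Example Inc in 2022."
driver.execute_query(
    "MERGE (c:Chunk {text: $text}) SET c.embedding = $embedding",
    text=text, embedding=embed(text),
)

# Semantic search against the stored vectors (assumes a vector index named 'chunk_embeddings').
records, _, _ = driver.execute_query(
    "CALL db.index.vector.queryNodes('chunk_embeddings', 3, $query_vector) "
    "YIELD node, score RETURN node.text AS text, score",
    query_vector=embed("Which companies has Acme acquired?"),
)
for record in records:
    print(record["text"], record["score"])
```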

Processing unstructured data is additionally enhanced with Amazon Bedrock, transforming the data into structured entities—such as a JSON object—that are then ingested into a knowledge graph. Once represented in the knowledge graph, the data can be used to surface insights and make real-time decisions. This enables developers to jumpstart knowledge graph generation in Neo4j, accelerating time-to-insights.
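The following sketch illustrates one way such a pipeline could be assembled: a Bedrock model extracts entities from raw text as a JSON object, which is then merged into the knowledge graph. The extraction prompt, JSON schema, model ID, and Cypher are illustrative assumptions, not specifics from the announcement.

```python
# Sketch: use a Bedrock model to extract structured entities from unstructured text as JSON,
# then ingest them into a Neo4j knowledge graph. Prompt, schema, and Cypher are illustrative.
import json
import boto3
from neo4j import GraphDatabase

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
driver = GraphDatabase.driver(
    "neo4j+s://<instance>.databases.neo4j.io", auth=("neo4j", "<password>")
)

document = "Jane Doe joined Acme Corp as CTO in March 2023."
prompt = (
    "Extract the person, organization, and role mentioned in the text below. "
    'Respond with JSON only, e.g. {"person": "...", "organization": "...", "role": "..."}.\n\n'
    + document
)

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 200,
        "messages": [{"role": "user", "content": prompt}],
    }),
)
# Assumes the model returns clean JSON text; production code would validate the output.
entity = json.loads(json.loads(response["body"].read())["content"][0]["text"])

# MERGE keeps ingestion idempotent: re-running the extraction does not duplicate nodes.
driver.execute_query(
    "MERGE (p:Person {name: $person}) "
    "MERGE (o:Organization {name: $organization}) "
    "MERGE (p)-[r:WORKS_AT]->(o) SET r.role = $role",
    person=entity["person"], organization=entity["organization"], role=entity["role"],
)
```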

“By harnessing the generative AI power of Amazon Bedrock in conjunction with Neo4j knowledge graphs, organizations can bring institutional knowledge to all team members,” explained Hasbe. “This powerful combination not only unlocks insights from unstructured data but also ensures that responses are grounded in factual information, not hallucinations.”

On top of the GenAI benefits provided by the Neo4j and Amazon Bedrock integration, Neo4j is announcing the general availability of Neo4j Aura Professional on AWS Marketplace. The offering gives developers a seamless, quick-start experience for building GenAI applications while consolidating billing on their AWS accounts. Its launch on AWS eliminates the need for personal credit cards to fund experimentation, takes advantage of existing AWS discounts and agreements, and enables easy spending monitoring within a single AWS billing system.

“Through our Aura cloud offering on AWS Marketplace and the robust functionalities of our graph database, we are streamlining the journey for developers, empowering them to effortlessly ramp-up and be productive on the practical applications of real-world generative AI in the cloud. Our collaboration seeks to help customers unlock novel opportunities and reweave the fabric of possibility in the realm of enterprise innovation, and we look forward to seeing the results,” concluded Hasbe.

For more information about this news, visit https://neo4j.com or https://aws.amazon.com.
