# cognee
Cognee - Build AI memory with a Knowledge Engine that learns

Demo · Docs · Learn More · Join Discord · Join r/AIMemory · Community Plugins & Add-ons

[![GitHub forks](https://img.shields.io/github/forks/topoteretes/cognee.svg?style=social&label=Fork&maxAge=2592000)](https://GitHub.com/topoteretes/cognee/network/) [![GitHub stars](https://img.shields.io/github/stars/topoteretes/cognee.svg?style=social&label=Star&maxAge=2592000)](https://GitHub.com/topoteretes/cognee/stargazers/) [![GitHub commits](https://badgen.net/github/commits/topoteretes/cognee)](https://GitHub.com/topoteretes/cognee/commit/) [![GitHub tag](https://badgen.net/github/tag/topoteretes/cognee)](https://github.com/topoteretes/cognee/tags/) [![Downloads](https://static.pepy.tech/badge/cognee)](https://pepy.tech/project/cognee) [![License](https://img.shields.io/github/license/topoteretes/cognee?colorA=00C586&colorB=000000)](https://github.com/topoteretes/cognee/blob/main/LICENSE) [![Contributors](https://img.shields.io/github/contributors/topoteretes/cognee?colorA=00C586&colorB=000000)](https://github.com/topoteretes/cognee/graphs/contributors)


Use our knowledge engine to build personalized and dynamic memory for AI Agents.

🌐 Available Languages : Deutsch | Español | Français | 日本語 | 한국어 | Português | Русский | 中文

## About Cognee

Cognee is an open-source knowledge engine that lets you ingest data in any format or structure and continuously learns to provide the right context for AI agents. It combines vector search, graph databases, and cognitive-science approaches so your documents are both searchable by meaning and connected by relationships, even as they change and evolve.

:star: _Help us reach more developers and grow the cognee community. Star this repo!_

:books: _Check our detailed [documentation](https://docs.cognee.ai/getting-started/installation#environment-configuration) for setup and configuration._

:crab: _Available as a plugin for your OpenClaw — [cognee-openclaw](https://www.npmjs.com/package/@cognee/cognee-openclaw)_

### Why use Cognee

- **Knowledge infrastructure**: unified ingestion, graph/vector search, local execution, ontology grounding, multimodal support
- **Persistent, learning agents**: learn from feedback, context management, cross-agent knowledge sharing
- **Reliable, trustworthy agents**: agentic user/tenant isolation, traceability, OTEL collector, audit trails

### Product Features

Cognee Products

## Basic Usage & Feature Guide

To learn more, [check out this short, end-to-end Colab walkthrough](https://colab.research.google.com/drive/12Vi9zID-M3fpKpKiaqDBvkk98ElkRPWy?usp=sharing) of Cognee's core features.

[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/12Vi9zID-M3fpKpKiaqDBvkk98ElkRPWy?usp=sharing)

## Quickstart

Let's try Cognee in just a few lines of code.

### Prerequisites

- Python 3.10 to 3.13

### Step 1: Install Cognee

You can install Cognee with **pip**, **poetry**, **uv**, or your preferred Python package manager.

```bash
uv pip install cognee
```

### Step 2: Configure the LLM

```python
import os

os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"
```

Alternatively, create a `.env` file using our [template](https://github.com/topoteretes/cognee/blob/main/.env.template). To integrate other LLM providers, see our [LLM Provider Documentation](https://docs.cognee.ai/setup-configuration/llm-providers).

### Step 3: Run the Pipeline

Cognee takes your documents, loads them into the knowledge engine, and searches the combined vector/graph relationships. Now, run a minimal pipeline:

```python
import asyncio
from pprint import pprint

import cognee


async def main():
    # Add text to cognee
    await cognee.add("Cognee turns documents into AI memory.")

    # Add to knowledge engine
    await cognee.cognify()

    # Query the knowledge graph
    results = await cognee.search("What does Cognee do?")

    # Display the results
    for result in results:
        pprint(result)


if __name__ == "__main__":
    asyncio.run(main())
```

As you can see, the output is generated from the document we previously stored in Cognee:

```bash
Cognee turns documents into AI memory.
```

### Use the Cognee CLI

As an alternative, you can get started with these essential commands:

```bash
cognee-cli add "Cognee turns documents into AI memory."
cognee-cli cognify
cognee-cli search "What does Cognee do?"
cognee-cli delete --all
```

To open the local UI, run:

```bash
cognee-cli -ui
```

## Examples

Browse more examples in the [`examples/`](examples/) folder — demos, guides, custom pipelines, and database configurations.

**Use Case 1 — Customer Support Agent**

```text
Goal: Resolve customer issues using their personal data across finance, support, and product history.

User: "My invoice looks wrong and the issue is still not resolved."

Cognee tracks: past interactions, failed actions, resolved cases, product history

# Agent response:
Agent: "I found 2 similar billing cases resolved last month. The issue was caused by a sync
delay between payment and invoice systems — a fix was applied on your account."

# What happens under the hood:
- Unifies data sources from various company channels
- Reconstructs the interaction timeline and tracks outcomes
- Retrieves similar resolved cases
- Maps to the best resolution strategy
- Updates memory after execution so the agent never repeats the same mistake
```

**Use Case 2 — Expert Knowledge Distillation (SQL Copilot)**

```text
Goal: Help junior analysts solve tasks by reusing expert-level queries, patterns, and reasoning.

User: "How do I calculate customer retention for this dataset?"

Cognee tracks: expert SQL queries, workflow patterns, schema structures, successful implementations

# Agent response:
Agent: "Here's how senior analysts solved a similar retention query. Cognee matched your schema
to a known structure and adapted the expert's logic to fit your dataset."

# What happens under the hood:
- Extracts and stores patterns from expert SQL queries and workflows
- Maps the current schema to previously seen structures
- Retrieves similar tasks and their successful implementations
- Adapts expert reasoning to the current context
- Updates memory with new successful patterns so junior analysts perform at near-expert level
```

## Deploy Cognee

Use [Cognee Cloud](https://www.cognee.ai) for a fully managed experience, or self-host with one of the 1-click deployment configurations below.

| Platform | Best For | Command |
|----------|----------|---------|
| **Cognee Cloud** | Managed service, no infrastructure to maintain | [Sign up](https://www.cognee.ai) |
| **Modal** | Serverless, auto-scaling, GPU workloads | `bash distributed/deploy/modal-deploy.sh` |
| **Railway** | Simplest PaaS, native Postgres | `railway init && railway up` |
| **Fly.io** | Edge deployment, persistent volumes | `bash distributed/deploy/fly-deploy.sh` |
| **Render** | Simple PaaS with managed Postgres | Deploy to Render button |
| **Daytona** | Cloud sandboxes (SDK or CLI) | See `distributed/deploy/daytona_sandbox.py` |

See the [`distributed/`](distributed/) folder for deploy scripts, worker configurations, and additional details.

## Latest News

[![Watch Demo](https://img.youtube.com/vi/8hmqS2Y5RVQ/maxresdefault.jpg)](https://www.youtube.com/watch?v=8hmqS2Y5RVQ&t=13s)

## Community & Support

### Contributing

We welcome contributions from the community! Your input helps make Cognee better for everyone. See [`CONTRIBUTING.md`](CONTRIBUTING.md) to get started.

### Code of Conduct

We're committed to fostering an inclusive and respectful community. Read our [Code of Conduct](https://github.com/topoteretes/cognee/blob/main/CODE_OF_CONDUCT.md) for guidelines.
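Both use cases follow the same retrieve-resolve-update loop: fetch similar past cases, apply the matched resolution, then write the outcome back to memory. The toy sketch below illustrates only that loop with a naive keyword matcher; it is not cognee's API (the real entry points are `cognee.add`, `cognee.cognify`, and `cognee.search` shown in the Quickstart), and the `CaseMemory` class and its data are hypothetical.

```python
from dataclasses import dataclass, field


@dataclass
class CaseMemory:
    """Hypothetical stand-in for the retrieve-resolve-update loop above.

    In cognee itself, retrieval runs over combined vector/graph
    relationships; here a keyword overlap plays that role.
    """

    resolved_cases: list = field(default_factory=list)

    def retrieve_similar(self, issue: str) -> list:
        # Naive keyword match standing in for vector/graph retrieval.
        terms = set(issue.lower().split())
        return [c for c in self.resolved_cases if terms & set(c["issue"].lower().split())]

    def record_resolution(self, issue: str, fix: str) -> None:
        # "Updates memory after execution so the agent never repeats the same mistake."
        self.resolved_cases.append({"issue": issue, "fix": fix})


memory = CaseMemory()
memory.record_resolution("invoice sync delay billing", "re-synced payment and invoice systems")
matches = memory.retrieve_similar("My invoice looks wrong")
```

The point of the sketch is the final write-back step: each resolved case enlarges the memory that the next retrieval draws from, which is what lets the agent improve over time.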
## Research & Citation

We recently published a research paper on optimizing knowledge graphs for LLM reasoning:

```bibtex
@misc{markovic2025optimizinginterfaceknowledgegraphs,
      title={Optimizing the Interface Between Knowledge Graphs and LLMs for Complex Reasoning},
      author={Vasilije Markovic and Lazar Obradovic and Laszlo Hajdu and Jovan Pavlovic},
      year={2025},
      eprint={2505.24478},
      archivePrefix={arXiv},
      primaryClass={cs.AI},
      url={https://arxiv.org/abs/2505.24478},
}
```