Metadata-Version: 2.4
Name: iflow-mcp_kartikk-26-mcp-server
Version: 0.4.0
Summary: Graphiti MCP Server
Requires-Python: <4,>=3.10
Description-Content-Type: text/markdown
Requires-Dist: mcp>=1.5.0
Requires-Dist: openai>=1.68.2
Requires-Dist: graphiti-core>=0.14.0
Requires-Dist: azure-identity>=1.21.0


<p align="center">
  <a href="https://www.getzep.com/">
    <img src="https://github.com/user-attachments/assets/119c5682-9654-4257-8922-56b7cb8ffd73" width="150" alt="Zep Logo">
  </a>
</p>

<h1 align="center">Graphiti MCP Demo</h1>
<h3 align="center">🚀 Build Real-Time Knowledge Graphs for AI Agents</h3>

<div align="center">

![Made with Love](https://img.shields.io/badge/Made%20with-Love-red?style=flat-square)
![Neo4j](https://img.shields.io/badge/DB-Neo4j-blue?style=flat-square&logo=neo4j)
![Docker](https://img.shields.io/badge/Container-Docker-blue?style=flat-square&logo=docker)
![License](https://img.shields.io/badge/License-MIT-green?style=flat-square)

</div>

---

## 📑 Table of Contents

- [About](#-about)
- [Workflow](#-workflow-of-the-project)
- [Setup](#-setup)
- [Running MCP Server](#-running-mcp-server)
- [Integrating MCP Clients](#-integrating-mcp-clients)
- [Verifying in Neo4j](#-verifying-in-neo4j)
- [Final Output](#-final-output-from-cursor--neo4j)
- [Contribution](#-contribution)
- [License](#-license)

---

## 📖 About

This project implements an **MCP server** and **AI agent integration** that use [Zep's Graphiti](https://www.getzep.com/) for persistent memory and context continuity across **Cursor** and **Claude**.  

This setup allows AI agents to:  
✅ Connect to the MCP for dynamic tool discovery  
✅ Select the optimal tool for a query  
✅ Formulate responses with context continuity  
✅ Persist interactions in **Neo4j** as a knowledge graph  

---

## 🔄 Workflow of the Project

The workflow of this project shows how **Cursor** or **Claude Desktop** integrates with the **MCP server** and stores context in **Graphiti memory (Neo4j)**:  

1. **Developer sends a Query** from Cursor IDE or Claude Desktop.  
2. The **MCP Host** connects to the **MCP Server**.  
3. The MCP Server makes **tool calls** (e.g., `add_episode`, `search_nodes`, `clear_graph`) to interact with Graphiti memory.  
4. Extracted **context** (documents, conversations, JSON payloads) is stored as structured data.  
5. This data flows into different layers of the **Graphiti Memory Structure**:  
   - **Level 1: Episodes** → Raw data such as documents, conversations, and JSON payloads  
   - **Level 2: Entities** → Nodes & relationships extracted from episodes  
   - **Level 3: Communities** → Clusters of entities with summaries  
6. The **MCP Host** sends the enriched **context** back to the developer as a response.  
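At the protocol level, each tool call in step 3 is a JSON-RPC 2.0 `tools/call` request from the MCP host to the MCP server. The sketch below shows the shape of such a message for `add_episode`; the `arguments` payload is illustrative only, so consult the Graphiti MCP server's tool schema for the real parameter names:

```python
import json

# JSON-RPC 2.0 request for an MCP "tools/call" invocation.
# The "arguments" below are illustrative, not Graphiti's exact schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "add_episode",
        "arguments": {
            "name": "conversation-2024-01-01",
            "episode_body": "Developer asked about Graphiti memory levels.",
        },
    },
}

wire = json.dumps(request)       # what actually travels over the transport
decoded = json.loads(wire)
print(decoded["params"]["name"])  # → add_episode
```

The same envelope is used for `search_nodes` and `clear_graph`; only `params.name` and `params.arguments` change.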

### 📽️ *Workflow Demo*  
![Workflow](./Assests/Workflow.gif)


## ⚙️ Setup

### 1️⃣ Clone GitHub Repository

```bash
git clone https://github.com/getzep/graphiti.git
cd graphiti/mcp_server
```

### 2️⃣ Install Dependencies

```bash
uv sync
```

### 3️⃣ Configure Environment

Create a `.env` file in `graphiti/mcp_server`:

```dotenv
# Neo4j Database Configuration
NEO4J_URI=bolt://localhost:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=demodemo

# OpenAI API Configuration
OPENAI_API_KEY=<your_openai_api_key>
MODEL_NAME=gpt-4.1-mini
```
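The server reads these variables from the environment at startup. As a sketch of what that loading amounts to, here is a minimal standard-library parser for a `.env` file of this shape (real projects typically use `python-dotenv` instead):

```python
def load_dotenv(path: str = ".env") -> dict:
    """Parse KEY=VALUE lines, skipping blanks and # comments."""
    values = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values

# To make the values visible to the current process:
#   import os
#   for key, value in load_dotenv().items():
#       os.environ.setdefault(key, value)
```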

---

## 🖥 Running MCP Server

The Graphiti MCP server can be run with **Docker** or directly with **Python**.
Docker is recommended for everyday use; running directly is useful for troubleshooting.

### ▶️ Run with Docker

```bash
docker compose up
```

📸 *Docker Container Running*
![Docker Up](./Assests/docker.png)

---

### ▶️ Run with Python (for debugging)

```bash
uv run graphiti_mcp_server.py --model gpt-4.1-mini --transport sse
```
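Once the server is up, you can smoke-test the SSE endpoint from Python before wiring up a client. This sketch assumes the default port 8000 from the command above and only inspects the response headers:

```python
import urllib.request

def check_sse(url: str = "http://localhost:8000/sse", timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers with an event-stream content type."""
    # urlopen returns as soon as the response headers arrive, so we
    # never have to consume the (infinite) event stream itself.
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return "text/event-stream" in resp.headers.get("Content-Type", "")

# check_sse()  # requires the MCP server to be running locally
```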

📸 *Graphiti SSE Output*
![SSE Output](./Assests/Graphiti%20SSE%20Output.png)

---

## 🤝 Integrating MCP Clients

### 🔹 Cursor

Add this to your `mcp.json`:

```json
{
  "mcpServers": {
    "Graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```
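If you manage editor configuration from a script, the same entry can be generated and sanity-checked in a few lines of Python; the server name `Graphiti` and port 8000 match the SSE server started above:

```python
import json

config = {
    "mcpServers": {
        "Graphiti": {
            "url": "http://localhost:8000/sse"
        }
    }
}

# Basic sanity check before writing the file out as mcp.json.
for name, server in config["mcpServers"].items():
    assert server["url"].startswith("http"), f"{name}: url must be an HTTP(S) endpoint"

print(json.dumps(config, indent=2))
```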

---

### 🔹 Claude

Update `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "graphiti": {
      "transport": "stdio",
      "command": "/path/to/uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/path/to/graphiti/mcp_server",
        "--project",
        ".",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ]
    }
  }
}
```

---

## 🕸 Verifying in Neo4j

Open the Neo4j browser → [http://localhost:7474/browser/](http://localhost:7474/browser/)
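A couple of label-agnostic starter queries help confirm data landed in the graph. They are kept here as Python strings so they can be pasted into the Neo4j browser or reused from a script via the official `neo4j` driver (a third-party dependency, installed separately); the credentials below match the `.env` from the setup section:

```python
# Starter Cypher queries: paste into the Neo4j browser, or run via the driver.
SHOW_NODES = "MATCH (n) RETURN n LIMIT 25"
SHOW_EDGES = "MATCH (a)-[r]->(b) RETURN a, type(r), b LIMIT 25"

def run_query(query: str, uri: str = "bolt://localhost:7687",
              user: str = "neo4j", password: str = "demodemo"):
    """Run a query against the local Neo4j instance (requires `pip install neo4j`)."""
    from neo4j import GraphDatabase  # imported lazily; third-party driver
    with GraphDatabase.driver(uri, auth=(user, password)) as driver:
        with driver.session() as session:
            return list(session.run(query))
```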

📸 *Connected Neo4j Browser*
![Neo4j Browser](./Assests/neo4j.png)

📸 *Data Stored in Neo4j*
![Neo4j Data](./Assests/datastored.png)

---

## 🔄 Final Output from Cursor → Neo4j

Flow:
**Cursor Prompt ➝ MCP Server ➝ Neo4j Graph Storage**

📸 *Final Cursor Output Sent to Neo4j*
![Final Output](./Assests/final-prompt-output.png)

---

## 🤝 Contribution

Contributions are welcome!

* Fork this repo
* Create a feature branch
* Make your changes and submit a PR

---

## 💡 Connect with Me  

Stay connected on LinkedIn for more projects, ideas, and collaborations:  
[Kartik Jain](https://linkedin.com/in/-kartikjain/)  

Let’s build, learn, and grow together! 🚀  

---
