
Graphiti MCP Server 🧠

🌟 A powerful knowledge graph server for AI agents, built with Neo4j and integrated with Model Context Protocol (MCP).

🚀 Features

  • 🔄 Dynamic knowledge graph management with Neo4j
  • 🤖 Seamless integration with OpenAI models
  • 🔌 MCP (Model Context Protocol) support
  • 🐳 Docker-ready deployment
  • 🎯 Custom entity extraction capabilities
  • 🔍 Advanced semantic search functionality

πŸ› οΈ Installation

Prerequisites

  • Docker and Docker Compose
  • Python 3.10 or higher
  • OpenAI API key
  • Minimum 4GB RAM (8GB recommended)
  • 2GB free disk space
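
Before installing, you can sanity-check the prerequisites from a shell. This is a minimal sketch using standard tooling; adjust the commands to your platform:

# Verify prerequisites (sketch; assumes a POSIX shell)
docker --version           # Docker Engine
docker compose version     # Compose v2 plugin
python3 --version          # should report 3.10 or higher
df -h .                    # confirm at least 2GB free disk space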

Quick Start 🚀

  1. Clone the repository:
git clone https://github.com/gifflet/graphiti-mcp-server.git
cd graphiti-mcp-server
  2. Set up environment variables:
cp .env.sample .env
  3. Edit .env with your configuration:
# Required for LLM operations
OPENAI_API_KEY=your_openai_api_key_here
MODEL_NAME=gpt-4.1-mini

# Optional: Custom OpenAI endpoint (e.g., for proxies)
# OPENAI_BASE_URL=https://api.openai.com/v1

# Neo4j Configuration (defaults work with Docker)
NEO4J_URI=bolt://neo4j:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=demodemo
  4. Start the services:
docker compose up -d
  5. Verify installation:
# Check if services are running
docker compose ps

# Check logs
docker compose logs graphiti-mcp

Alternative: Environment Variables

You can run with environment variables directly:

OPENAI_API_KEY=your_key MODEL_NAME=gpt-4.1-mini docker compose up

🔧 Configuration

Service Ports 🌐

| Service       | Port | Purpose                               |
|---------------|------|---------------------------------------|
| Neo4j Browser | 7474 | Web interface for graph visualization |
| Neo4j Bolt    | 7687 | Database connection                   |
| Graphiti MCP  | 8000 | MCP server endpoint                   |
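
Once the stack is up, you can confirm all three ports are reachable from the host. This is a quick sketch using nc as a generic TCP probe (assumes nc is installed; any port-checking tool works):

# Probe each service port from the host
nc -z localhost 7474 && echo "Neo4j Browser reachable"
nc -z localhost 7687 && echo "Neo4j Bolt reachable"
nc -z localhost 8000 && echo "Graphiti MCP reachable"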

Environment Variables 🔧

OpenAI Configuration

| Variable            | Required | Default                | Description                                          |
|---------------------|----------|------------------------|------------------------------------------------------|
| OPENAI_API_KEY      | ✅       | -                      | Your OpenAI API key                                  |
| OPENAI_BASE_URL     | ❌       | -                      | Custom OpenAI API endpoint (consumed by OpenAI SDK)  |
| MODEL_NAME          | ❌       | gpt-4.1-mini           | Main LLM model to use                                |
| SMALL_MODEL_NAME    | ❌       | gpt-4.1-nano           | Small LLM model for lighter tasks                    |
| LLM_TEMPERATURE     | ❌       | 0.0                    | LLM temperature (0.0-2.0)                            |
| EMBEDDER_MODEL_NAME | ❌       | text-embedding-3-small | Embedding model                                      |
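
Putting these together, a complete .env for the standard OpenAI path might look like the following (all values are placeholders, shown here only to illustrate the variables above):

# Example .env (placeholder values)
OPENAI_API_KEY=your_openai_api_key_here
MODEL_NAME=gpt-4.1-mini
SMALL_MODEL_NAME=gpt-4.1-nano
LLM_TEMPERATURE=0.0
EMBEDDER_MODEL_NAME=text-embedding-3-small
# OPENAI_BASE_URL=https://api.openai.com/v1  # only needed for proxies or custom endpoints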

Neo4j Configuration

| Variable       | Required | Default           | Description          |
|----------------|----------|-------------------|----------------------|
| NEO4J_URI      | ❌       | bolt://neo4j:7687 | Neo4j connection URI |
| NEO4J_USER     | ❌       | neo4j             | Neo4j username       |
| NEO4J_PASSWORD | ❌       | demodemo          | Neo4j password       |
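
To verify these credentials against the running container, one option is cypher-shell, which ships in the official Neo4j image (a sketch; assumes the compose service is named neo4j and uses the default credentials):

# Run a trivial query against the neo4j service
docker compose exec neo4j cypher-shell -u neo4j -p demodemo "RETURN 1;"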

Server Configuration

| Variable        | Required | Default | Description                             |
|-----------------|----------|---------|-----------------------------------------|
| MCP_SERVER_HOST | ❌       | -       | MCP server host binding                 |
| SEMAPHORE_LIMIT | ❌       | 10      | Concurrent operation limit for LLM calls |

Azure OpenAI Configuration (Optional)

For Azure OpenAI deployments, use these environment variables instead of the standard OpenAI configuration:

| Variable                               | Required | Default | Description                         |
|----------------------------------------|----------|---------|-------------------------------------|
| AZURE_OPENAI_ENDPOINT                  | ✅*      | -       | Azure OpenAI endpoint URL           |
| AZURE_OPENAI_API_VERSION               | ✅*      | -       | Azure OpenAI API version            |
| AZURE_OPENAI_DEPLOYMENT_NAME           | ✅*      | -       | Azure OpenAI deployment name        |
| AZURE_OPENAI_USE_MANAGED_IDENTITY      | ❌       | false   | Use Azure managed identity for auth |
| AZURE_OPENAI_EMBEDDING_ENDPOINT        | ❌       | -       | Separate endpoint for embeddings    |
| AZURE_OPENAI_EMBEDDING_API_VERSION     | ❌       | -       | API version for embeddings          |
| AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME | ❌       | -       | Deployment name for embeddings      |
| AZURE_OPENAI_EMBEDDING_API_KEY         | ❌       | -       | Separate API key for embeddings     |

* Required when using Azure OpenAI

Notes:

  • OPENAI_BASE_URL is consumed directly by the OpenAI Python SDK; it is useful for proxy configurations or custom endpoints.
  • SEMAPHORE_LIMIT controls concurrent LLM API calls: decrease it if you encounter rate limits, increase it for higher throughput (see the example below).
  • Azure configuration is an alternative to the standard OpenAI configuration; do not mix the two.
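
For example, to lower concurrency for a single run when hitting OpenAI rate limits (a sketch; assumes docker-compose.yml forwards SEMAPHORE_LIMIT to the container):

# Temporarily reduce concurrent LLM calls
SEMAPHORE_LIMIT=5 docker compose up -d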

Neo4j Settings 🗄️

Default configuration for Neo4j:

  • Username: neo4j
  • Password: demodemo
  • URI: bolt://neo4j:7687 (within Docker network)
  • Memory settings optimized for development

Docker Environment Variables 🐳

As in the quick start, you can pass variables inline:

OPENAI_API_KEY=your_key MODEL_NAME=gpt-4.1-mini docker compose up

For Azure OpenAI:

AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com \
AZURE_OPENAI_API_VERSION=2024-02-01 \
AZURE_OPENAI_DEPLOYMENT_NAME=your-deployment \
OPENAI_API_KEY=your_key \
docker compose up

🔌 Integration

Cursor IDE Integration 🖥️

  1. Configure Cursor MCP settings:
{
  "mcpServers": {
    "Graphiti": {
      "command": "uv",
      "args": ["run", "graphiti_mcp_server.py"],
      "env": {
        "OPENAI_API_KEY": "your_key_here"
      }
    }
  }
}
  2. For Docker-based setup:
{
  "mcpServers": {
    "Graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}
  3. Add Graphiti rules to Cursor's User Rules (see graphiti_cursor_rules.mdc)
  4. Start an agent session in Cursor

Other MCP Clients

The server supports standard MCP transports:

  • SSE (Server-Sent Events): http://localhost:8000/sse
  • WebSocket: ws://localhost:8000/ws
  • Stdio: Direct process communication
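
To watch the SSE stream directly, curl can stay attached to the endpoint. This is a generic SSE check; the exact events emitted depend on the server:

# Stream raw SSE events from the MCP endpoint (Ctrl+C to stop)
curl -N -H "Accept: text/event-stream" http://localhost:8000/sse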

💻 Development

Local Development Setup

  1. Install dependencies:
# Using uv (recommended)
curl -LsSf https://astral.sh/uv/install.sh | sh
uv sync

# Or using pip
pip install -r requirements.txt
  2. Start Neo4j locally:
docker run -d \
  --name neo4j-dev \
  -p 7474:7474 -p 7687:7687 \
  -e NEO4J_AUTH=neo4j/demodemo \
  neo4j:5.26.0
  3. Run the server:
# Set environment variables
export OPENAI_API_KEY=your_key
export NEO4J_URI=bolt://localhost:7687

# Run with stdio transport
uv run graphiti_mcp_server.py

# Or with SSE transport
uv run graphiti_mcp_server.py --transport sse --use-custom-entities

Testing

# Run basic connectivity test
curl http://localhost:8000/health

# Test MCP endpoint
curl http://localhost:8000/sse
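
For scripted checks (e.g., in CI), a small retry loop avoids racing container startup. A minimal sketch, assuming the /health route above:

# Wait up to ~30s for the server to report healthy
for i in $(seq 1 30); do
  if curl -sf http://localhost:8000/health > /dev/null; then
    echo "graphiti-mcp is healthy" && break
  fi
  sleep 1
done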

πŸ” Troubleshooting

Common Issues

🐳 Docker Issues

# Clean up and restart
docker compose down -v
docker compose up --build

# Check disk space
docker system df

Logs and Debugging

# View all logs
docker compose logs -f

# View specific service logs
docker compose logs -f graphiti-mcp
docker compose logs -f neo4j

# Enable debug logging (docker compose up has no -e flag; pass the variable inline instead,
# assuming docker-compose.yml forwards LOG_LEVEL to the container)
LOG_LEVEL=DEBUG docker compose up

Performance Issues

  • Memory: Increase the Neo4j heap size in docker-compose.yml (see the sketch after this list)
  • Storage: Monitor Neo4j data volume usage
  • Network: Check that no firewall is blocking ports 7474, 7687, or 8000
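
If you would rather not edit docker-compose.yml, heap settings can also be passed as environment variables to a standalone dev container. Neo4j 5 maps server.memory.heap.* settings to the variable names below; the sizes are illustrative, not recommendations:

# Raise Neo4j heap for a standalone dev container (illustrative values)
docker run -d \
  --name neo4j-dev \
  -p 7474:7474 -p 7687:7687 \
  -e NEO4J_AUTH=neo4j/demodemo \
  -e NEO4J_server_memory_heap_initial__size=1G \
  -e NEO4J_server_memory_heap_max__size=2G \
  neo4j:5.26.0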

πŸ—οΈ Architecture

┌─────────────────┐    ┌──────────────────┐    ┌─────────────────┐
│   MCP Client    │    │  Graphiti MCP    │    │     Neo4j       │
│   (Cursor)      │◄──►│     Server       │◄──►│   Database      │
│                 │    │   (Port 8000)    │    │  (Port 7687)    │
└─────────────────┘    └──────────────────┘    └─────────────────┘
                                │
                                ▼
                       ┌──────────────────┐
                       │   OpenAI API     │
                       │   (LLM Client)   │
                       └──────────────────┘

Components

  • Neo4j Database: Graph storage and querying
  • Graphiti MCP Server: API layer and LLM operations
  • OpenAI Integration: Entity extraction and semantic processing
  • MCP Protocol: Standardized AI agent communication

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

πŸ“ License

This project is licensed under the MIT License; see the LICENSE file for details.

πŸ™ Acknowledgments


Need help? Open an issue or check our troubleshooting guide above.
