Model Context Protocol (MCP) server for Recruitee – advanced search, reporting, and analytics for recruitment data.
The Model Context Protocol (MCP) is rapidly becoming the standard for connecting AI agents to external services. This project implements an MCP server for Recruitee, enabling advanced, AI-powered search, filtering, and reporting on recruitment data.
Unlike basic CRUD wrappers, this server focuses on the tasks where LLMs and AI agents excel: summarizing, searching, and filtering. It exposes a set of tools and prompt templates, making it easy for any MCP-compatible client to interact with Recruitee data in a structured, agent-friendly way.
- Advanced Candidate Search & Filtering – Search for candidates by skills, status, talent pool, job, tags, and more. Example: "Find candidates with Elixir experience who were rejected due to salary expectations."
- Recruitment Summary Reports – Generate summaries of recruitment activities, such as time spent in each stage, total process duration, and stage-by-stage breakdowns.
- Recruitment Statistics – Calculate averages and metrics (e.g., average expected salary for backend roles, average time to hire, contract-type stats).
- General Search – Quickly find candidates, recruitments, or talent pools by name or attribute.
- GDPR Compliance (planned) – Automatic deletion of personal data after 2 years, configurable per talent pool or recruitment.
- Prompt Templates – Exposes prompt templates for LLM-based clients, ensuring consistent and high-quality summaries.
- Find candidates with Elixir experience who were rejected due to salary expectations.
- Show me their personal details including CV URL.
- Why was candidate 'X' disqualified and at what stage?
- What are the other stages for this offer?
- Language: Python
- Framework: FastMCP
- API: Recruitee Careers Site API
- Schemas: All MCP tool schemas are generated from Pydantic models, with rich metadata for LLMs.
The server retrieves and processes data from Recruitee, exposing it via MCP tools. Summaries are composed by the client using provided prompt templates.
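As a rough sketch of that client-side composition step: the client fetches a prompt template from the server and fills it with tool results before sending it to the LLM. The template wording below is invented for illustration; the real templates ship with the server.

```python
# Hypothetical prompt template; the actual templates are exposed by the
# server over MCP, so this wording is an assumption for illustration only.
SUMMARY_TEMPLATE = (
    "Summarize the recruitment of {name} for the {job} role. "
    "Current stage: {stage}. Days in process: {days}."
)

# Candidate data as it might come back from an MCP tool call.
candidate = {"name": "Jane Doe", "job": "Backend Engineer",
             "stage": "Interview", "days": 12}

prompt = SUMMARY_TEMPLATE.format(**candidate)
print(prompt)
```

Keeping the template on the server is what makes summaries consistent across different MCP clients.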
- stdio – For local development and testing.
- streamable-http – For remote, production-grade deployments (recommended).
- SSE – Supported but deprecated in some MCP frameworks.
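The transport is chosen at startup via a `--transport` flag (visible in the stdio config's `args`). A minimal sketch of how such a flag could be parsed; the actual `app.py` implementation may differ, and the `mcp.run` call mentioned in the comment is an assumed FastMCP API.

```python
import argparse

def parse_transport(argv=None):
    """Parse the --transport flag used to choose how the server is exposed."""
    parser = argparse.ArgumentParser(description="Recruitee MCP server")
    parser.add_argument(
        "--transport",
        choices=["stdio", "streamable-http", "sse"],
        default="stdio",
        help="MCP transport to serve on",
    )
    return parser.parse_args(argv).transport

# The parsed value would then be handed to the framework,
# e.g. mcp.run(transport=...) in FastMCP (assumed API).
print(parse_transport(["--transport", "streamable-http"]))  # streamable-http
```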
- Configure your MCP client:

  ```json
  {
    "mcpServers": {
      "recruitee": {
        "command": "/path/to/.venv/bin/python",
        "args": ["/path/to/recruitee-mcp-server/src/app.py", "--transport", "stdio"]
      }
    }
  }
  ```
- Run with mcp-cli:

  ```shell
  mcp-cli chat --server recruitee --config-file /path/to/mcp-cli/server_config.json
  ```
- Configure your MCP client:

  ```json
  {
    "mcpServers": {
      "recruitee": {
        "transport": "streamable-http",
        "url": "https://recruitee-mcp-server.fly.dev/mcp"
      }
    }
  }
  ```
- Or use mcp-remote for free-tier clients:

  ```json
  {
    "mcpServers": {
      "recruitee": {
        "command": "npx",
        "args": [
          "mcp-remote",
          "https://recruitee-mcp-server.fly.dev/mcp/",
          "--header",
          "Authorization: Bearer ${MCP_BEARER_TOKEN}"
        ],
        "env": { "MCP_BEARER_TOKEN": "KEY" }
      }
    }
  }
  ```
- Set your secrets in `.env`.
- Deploy:

  ```shell
  flyctl auth login
  make deploy
  ```
- The server is live at: https://recruitee-mcp-server.fly.dev/
Contributions, issues, and feature requests are welcome!
See CONTRIBUTING.md for details.
This project is MIT licensed.
Empower your AI agents with advanced recruitment data access and analytics.