An automated analytics reporting tool that leverages MiraScope with Ollama, Cloudflare Workers, or Google Gemini to generate intelligent reports from Umami analytics data using the Model Context Protocol (MCP).
Blog post: https://www.rhelmer.org/blog/ai-powered-analytics-reports-using-mcp/
This project combines several powerful tools to create automated analytics reports:
- MiraScope: MCP client for orchestrating AI interactions
- Ollama/Cloudflare Workers/Google Gemini: LLM inference backends
- Umami MCP Server: Connects to your Umami analytics instance to fetch website data
- Automated Reporting: Generates comprehensive analytics reports using AI
- 🤖 AI-Powered Analysis: Uses large language models to analyze website analytics data
- 📊 Comprehensive Reports: Generates detailed insights from your Umami analytics
- 🔄 Flexible Backends: Choose between local Ollama, Cloudflare Workers, or Google Gemini
- 💬 Interactive Mode: Chat interface for exploring your analytics data
- 🚀 Easy Setup: Simple installation and configuration process
- Python 3.8+
- Access to an Umami analytics instance
- An AI provider:
  - Ollama installed locally with a Llama model, OR
  - A Cloudflare Workers account with AI access, OR
  - A Google AI Studio API key for Gemini
- Clone the repository:
  git clone <your-repo-url>
  cd <your-repo-name>
- Install dependencies (uv resolves the requirements at run time via --with-requirements):
  pip install uv
- Set up environment variables:
  cp .env.example .env
  Edit .env with your configuration:
  # Umami Configuration
  UMAMI_API_URL=https://your-umami-instance.com/api
  UMAMI_USERNAME=username
  UMAMI_PASSWORD=password
  UMAMI_TEAM_ID=your-team-id
  # Cloudflare AI Configuration
  CLOUDFLARE_ACCOUNT_ID=your-cloudflare-account-id
  CLOUDFLARE_API_TOKEN=your-cloudflare-api-token
  # Gemini API Configuration
  GEMINI_API_KEY=your-gemini-api-key
- Clone the Umami MCP Server (check its documentation) so you can point the --mcp-server-dir flag at it, e.g. --mcp-server-dir ~/src/umami_mcp_server in the commands below.
# Install Ollama
## Linux
curl -fsSL https://ollama.ai/install.sh | sh
## macOS
brew install ollama
# Start Ollama service
ollama serve
# Pull Llama model
ollama pull llama3.2
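Before generating reports, it can help to confirm the pulled model actually responds. A quick smoke test, assuming the llama3.2 model pulled above:
# Quick smoke test of the pulled model
ollama run llama3.2 "Reply with OK"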
- Sign up for Cloudflare Workers
- Enable AI features in your account
- Get your API token and account ID
- Add credentials to the .env file (a quick check of these credentials is shown below)
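If you want to verify the credentials before a full run, one option is to call the Workers AI REST API directly. This is a sketch against Cloudflare's standard /ai/run route; the model name is only an example, not something this project requires:
# Sanity-check Workers AI access (model name is just an example)
curl https://api.cloudflare.com/client/v4/accounts/$CLOUDFLARE_ACCOUNT_ID/ai/run/@cf/meta/llama-3.1-8b-instruct \
  -H "Authorization: Bearer $CLOUDFLARE_API_TOKEN" \
  -d '{"prompt": "Reply with OK"}'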
- Go to Google AI Studio
- Create an API key
- Add the GEMINI_API_KEY to your .env file (a quick key check is shown below)
- Ensure you have Node.js and npx installed to use the gemini-cli provider; the script will fall back to the REST API if the CLI is not available
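To confirm the key works before a full run, you can call the Gemini REST API directly. A minimal sketch; the model name below is only an example:
# Verify the Gemini API key against the REST endpoint (example model name)
curl "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=$GEMINI_API_KEY" \
  -H 'Content-Type: application/json' \
  -d '{"contents": [{"parts": [{"text": "Reply with OK"}]}]}'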
You can specify the AI provider using the --ai-provider flag. Supported providers are cloudflare, ollama, and gemini-cli.
Start an interactive session to explore your analytics data:
# Using Cloudflare (default)
uv run --with-requirements requirements.txt run.py --start-date 2024-01-01 --end-date 2024-01-31 --website example.com --mcp-server-dir ~/src/umami_mcp_server --chat
# Using Ollama
uv run --with-requirements requirements.txt run.py --start-date 2024-01-01 --end-date 2024-01-31 --website example.com --mcp-server-dir ~/src/umami_mcp_server --chat --ai-provider ollama
# Using Gemini
uv run --with-requirements requirements.txt run.py --start-date 2024-01-01 --end-date 2024-01-31 --website example.com --mcp-server-dir ~/src/umami_mcp_server --chat --ai-provider gemini-cli
Example interactions:
- "Show me a summary of last month's traffic"
- "What are my top pages this week?"
- "Generate a comprehensive monthly report"
- "Compare this month's performance to last month"
Generate specific reports directly:
# Custom date range with the default provider (Cloudflare)
uv run --with-requirements requirements.txt run.py --start-date 2024-01-01 --end-date 2024-01-31 --website example.com --mcp-server-dir ~/src/umami_mcp_server
# Specifying a different provider
uv run --with-requirements requirements.txt run.py --start-date 2024-01-01 --end-date 2024-01-31 --website example.com --mcp-server-dir ~/src/umami_mcp_server --ai-provider ollama
Set up automated report generation using cron:
# Add to crontab for weekly reports every Monday at 9 AM
0 9 * * 1 cd /path/to/project && uv run --with-requirements requirements.txt run.py --start-date 2024-01-01 --end-date 2024-01-31 --website example.com --mcp-server-dir ~/src/umami_mcp_server --ai-provider cloudflare
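The crontab line above hardcodes the date range; for a rolling weekly report you would compute the dates at run time instead. A minimal sketch, assuming GNU date (note that % must be escaped as \% inside a crontab):
# Weekly report every Monday at 9 AM, covering the previous seven days
0 9 * * 1 cd /path/to/project && uv run --with-requirements requirements.txt run.py --start-date $(date -d '7 days ago' +\%F) --end-date $(date -d 'yesterday' +\%F) --website example.com --mcp-server-dir ~/src/umami_mcp_server --ai-provider cloudflare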
- Traffic Summary: Page views, unique visitors, sessions
- Top Content: Most popular pages and referrers
- Geographic Analysis: Visitor locations and demographics
- Device & Browser: Technology usage patterns
- Performance Trends: Growth metrics and comparisons
- Custom Insights: AI-generated observations and recommendations
Connection Errors
# Umami v2 uses token auth rather than HTTP basic auth:
# log in to get a token, then call the API with it
curl -X POST https://your-umami-instance.com/api/auth/login -H 'Content-Type: application/json' -d '{"username": "username", "password": "password"}'
curl -H 'Authorization: Bearer <token>' https://your-umami-instance.com/api/websites
Ollama Issues
# Verify Ollama is running
ollama list
ollama ps
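If the CLI looks healthy but connections still fail, check the HTTP API that clients talk to (11434 is Ollama's default port):
# Confirm the Ollama HTTP API is reachable and lists your models
curl http://localhost:11434/api/tags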
.
├── run.py              # Main application entry point
└── requirements.txt    # Python dependencies
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
- mirascope: MCP client framework
- umami-mcp-server: Umami analytics MCP integration
- python-dotenv: Environment variable management
For issues and questions:
- Create an issue in this repository
- Check the Umami MCP Server documentation
- Review MiraScope documentation
- Umami for the analytics platform
- Umami MCP Server for the MCP integration with Umami
- Ollama for local LLM inference
- Cloudflare for cloud AI services
- Google Gemini for the Gemini API