A simple API that connects to an Ollama model and MCP server for agent interactions.
Install requirements:
pip install -r requirements.txt
Ensure Ollama is running (default: http://localhost:11434)
- Set a custom Ollama host with the OLLAMA_HOST environment variable:
OLLAMA_HOST=http://your-ollama-host:11434
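A minimal sketch of how a client could resolve the host, falling back to the default address above when OLLAMA_HOST is unset (the function name is illustrative, not from the project):

```python
import os

def resolve_ollama_host() -> str:
    # Read the OLLAMA_HOST environment variable described above,
    # defaulting to the standard local Ollama address.
    return os.environ.get("OLLAMA_HOST", "http://localhost:11434")
```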
Run the API server:
python mcp_client.py
Send a POST request to the /message endpoint with a JSON body:
curl -X POST http://localhost:8000/message \
-H "Content-Type: application/json" \
-d '{"message": "What is the weather in Paris?"}'
The response is a JSON object containing the agent's reply:
{
"response": "..."
}
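The same request can be made from Python using only the standard library. This is a sketch, not code from the project; the helper names are illustrative, and the URL and payload shape are taken from the curl example above:

```python
import json
from urllib import request

API_URL = "http://localhost:8000/message"  # default server address from the examples above

def build_message_request(message: str) -> request.Request:
    # Build the same POST request as the curl example.
    payload = json.dumps({"message": message}).encode("utf-8")
    return request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask_agent(message: str) -> str:
    # Send the request and return the "response" field of the JSON reply.
    with request.urlopen(build_message_request(message)) as resp:
        return json.loads(resp.read().decode("utf-8"))["response"]
```

Calling `ask_agent("What is the weather in Paris?")` against a running server returns the string from the "response" field.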
If you're experiencing issues with the FastAPI version, you can use the simpler single-shot client for testing:
python singleshot_client.py
This will prompt you to enter a message and will process it directly without using FastAPI.
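The actual contents of singleshot_client.py are not shown here; the sketch below is only an illustration of what such a direct, FastAPI-free flow might look like, sending the prompt straight to Ollama's /api/generate endpoint. The model name is an assumption:

```python
import json
import os
from urllib import request

def build_generate_request(prompt: str, model: str = "llama3") -> request.Request:
    # Talk to Ollama directly, honoring OLLAMA_HOST as described above.
    host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return request.Request(
        f"{host}/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def main() -> None:
    # Prompt for a message, send it, and print the model's reply.
    prompt = input("Enter a message: ")
    with request.urlopen(build_generate_request(prompt)) as resp:
        print(json.loads(resp.read())["response"])
```

Run `main()` interactively to try it; it requires a running Ollama instance but no FastAPI server.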