A Model Context Protocol (MCP) server that integrates Ollama with Claude Desktop or other MCP-compatible clients, enabling seamless interaction with local language models. It lets you list downloaded models, get detailed model information, and ask questions to a specified model.
First, make sure Ollama is installed and pull at least one model:

```bash
ollama pull llama2
```
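With the Ollama daemon running, you can confirm the model is available with `ollama list`.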
To integrate with Claude Desktop, add the following configuration to your claude_desktop_config.json file (location varies by OS):
```json
{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": [
        "mcp-ollama"
      ]
    }
  }
}
```
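By default, Claude Desktop typically reads this file from `~/Library/Application Support/Claude/claude_desktop_config.json` on macOS and `%APPDATA%\Claude\claude_desktop_config.json` on Windows; adjust if your installation differs.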
For development:

```bash
git clone https://github.com/emgeee/mcp-ollama.git
cd mcp-ollama
uv sync
```
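To have Claude Desktop run your local checkout instead of the published package, you can point the config above at the clone, e.g. `"command": "uv"` with `"args": ["--directory", "/path/to/mcp-ollama", "run", "mcp-ollama"]`; the `mcp-ollama` entry-point name here is an assumption based on the package name.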
Test using MCP Inspector:

```bash
mcp dev src/mcp_ollama/server.py
```
The server offers these tools (a minimal client sketch follows the list):

- `list_models`: lists all downloaded Ollama models.
- `show_model`: retrieves detailed information about a specific model.
- `ask_model`: queries a specified model with a question.
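As a sketch of how these tools can be called programmatically outside Claude Desktop, the following uses the official MCP Python SDK (the `mcp` package) to launch the server over stdio and invoke `list_models`. The `uvx mcp-ollama` launch command mirrors the config above, and the sketch assumes `list_models` takes no arguments.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server the same way Claude Desktop does (see config above).
    params = StdioServerParameters(command="uvx", args=["mcp-ollama"])

    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server advertises.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call list_models (assumed to take no arguments).
            result = await session.call_tool("list_models", {})
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```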
License: MIT