MCP Ollama

A Model Context Protocol (MCP) server designed to integrate Ollama with Claude Desktop or other MCP-compatible clients, enabling seamless interaction with local language models.


Requirements

  • Python 3.10+
  • Ollama (installed and running from ollama.com/download)
  • At least one Ollama model downloaded (e.g., ollama pull llama2)
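
These prerequisites can be verified with a small Python preflight script. The script below is illustrative and not part of this repository; it only checks the two items above (a 3.10+ interpreter and an `ollama` executable on `PATH`):

```python
import shutil
import sys

def check_requirements() -> dict:
    """Return a requirement-name -> bool map for the prerequisites above."""
    return {
        # The server requires Python 3.10 or newer
        "python_3_10_plus": sys.version_info >= (3, 10),
        # Ollama must be installed and on PATH (from ollama.com/download)
        "ollama_installed": shutil.which("ollama") is not None,
    }

if __name__ == "__main__":
    for name, ok in check_requirements().items():
        print(f"{name}: {'OK' if ok else 'MISSING'}")
```

Whether a model has actually been pulled still needs a live check (e.g. running `ollama list`), which this sketch deliberately leaves out.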

Configure Claude Desktop

To integrate, add the following configuration to your Claude Desktop's claude_desktop_config.json file (location varies by OS):

{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": [
        "mcp-ollama"
      ]
    }
  }
}
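
If you prefer to patch the file programmatically rather than by hand, the entry above can be merged into an existing config without clobbering other registered servers. This helper is a sketch, not part of the repo; the config path varies by OS (on macOS it is typically `~/Library/Application Support/Claude/claude_desktop_config.json`, on Windows `%APPDATA%\Claude\claude_desktop_config.json`):

```python
import json
from pathlib import Path

# The "ollama" server entry shown in the JSON snippet above
OLLAMA_ENTRY = {"command": "uvx", "args": ["mcp-ollama"]}

def add_ollama_server(config_path: Path) -> dict:
    """Merge the ollama entry into claude_desktop_config.json, keeping
    any other mcpServers entries intact."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {})["ollama"] = OLLAMA_ENTRY
    config_path.write_text(json.dumps(config, indent=2))
    return config
```

Restart Claude Desktop after editing the config so the new server is picked up.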

Development

Clone the repository and install its dependencies with uv:

git clone https://github.com/emgeee/mcp-ollama.git
cd mcp-ollama
uv sync

Test using MCP Inspector:

mcp dev src/mcp_ollama/server.py

Features

The server offers these tools:

  • list_models: Lists all downloaded Ollama models.
  • show_model: Retrieves detailed information about a specific model.
  • ask_model: Sends a question to a specified model and returns its response.
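
The README does not show the server's implementation, but each tool plausibly wraps an endpoint of Ollama's local REST API (default port 11434). A minimal sketch of that assumed mapping, using the standard `/api/tags`, `/api/show`, and `/api/generate` endpoints:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local API address

def build_request(tool: str, **kwargs):
    """Map each MCP tool to the Ollama REST endpoint it plausibly wraps.
    Returns (HTTP method, path, JSON payload or None)."""
    if tool == "list_models":
        # GET /api/tags lists locally downloaded models
        return ("GET", "/api/tags", None)
    if tool == "show_model":
        # POST /api/show returns details for one model
        return ("POST", "/api/show", {"model": kwargs["model"]})
    if tool == "ask_model":
        # POST /api/generate runs a one-shot, non-streaming completion
        return ("POST", "/api/generate",
                {"model": kwargs["model"], "prompt": kwargs["question"],
                 "stream": False})
    raise ValueError(f"unknown tool: {tool}")

def call(tool: str, **kwargs):
    """Execute a tool against a running Ollama instance."""
    method, path, payload = build_request(tool, **kwargs)
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(OLLAMA_URL + path, data=data, method=method,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

This is only an illustration of the underlying API surface; the actual server speaks MCP over stdio to Claude Desktop rather than exposing these functions directly.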

License

MIT

Repository

  • emgeee/mcp-ollama
  • Created: February 4, 2025
  • Updated: March 17, 2025
  • Language: Python
  • Category: AI