<p align="center">
<img src="https://github.com/jonigl/mcp-client-for-ollama/blob/main/misc/ollmcp-logo-512.png?raw=true" width="256" alt="ollmcp Logo"/>
</p>
<p align="center">
<i>A simple yet powerful Python client for interacting with Model Context Protocol (MCP) servers using Ollama, allowing local LLMs to use tools.</i>
</p>
---
# MCP Client for Ollama (ollmcp)
[Python 3.10+](https://www.python.org/downloads/) ·
[PyPI: ollmcp](https://pypi.org/project/ollmcp/) ·
[PyPI: mcp-client-for-ollama](https://pypi.org/project/mcp-client-for-ollama/) ·
[Publish workflow](https://github.com/jonigl/mcp-client-for-ollama/actions/workflows/publish.yml) ·
[CI workflow](https://github.com/jonigl/mcp-client-for-ollama/actions/workflows/ci.yml)
<p align="center">
<img src="https://raw.githubusercontent.com/jonigl/mcp-client-for-ollama/v0.15.0/misc/ollmcp-demo.gif" alt="MCP Client for Ollama Demo">
</p>
<p align="center">
<a href="https://asciinema.org/a/jxc6N8oKZAWrzH8aK867zhXdO" target="_blank">🎥 Watch this demo as an Asciinema recording</a>
</p>
## Table of Contents
- [Overview](#overview)
- [Features](#features)
- [Requirements](#requirements)
- [Quick Start](#quick-start)
- [Usage](#usage)
- [Command-line Arguments](#command-line-arguments)
- [Usage Examples](#usage-examples)
- [Interactive Commands](#interactive-commands)
- [Tool and Server Selection](#tool-and-server-selection)
- [Model Selection](#model-selection)
- [Advanced Model Configuration](#advanced-model-configuration)
- [Server Reloading for Development](#server-reloading-for-development)
- [Human-in-the-Loop (HIL) Tool Execution](#human-in-the-loop-hil-tool-execution)
- [Performance Metrics](#performance-metrics)
- [Autocomplete and Prompt Features](#autocomplete-and-prompt-features)
- [Configuration Management](#configuration-management)
- [Server Configuration Format](#server-configuration-format)
- [Compatible Models](#compatible-models)
- [Finding MCP Servers](#finding-mcp-servers)
- [Related Projects](#related-projects)
- [License](#license)
- [Acknowledgments](#acknowledgments)
## Overview
MCP Client for Ollama (`ollmcp`) is an interactive terminal application (TUI) that connects local Ollama Large Language Models (LLMs) to Model Context Protocol (MCP) servers, giving LLMs advanced tool-use and workflow-automation capabilities. The client provides a user-friendly interface for managing tools, models, and server connections in real time, with no manual coding required.

Whether you're developing, testing, or simply exploring LLM tool integration, `ollmcp` streamlines your workflow with fuzzy autocomplete, advanced model configuration, hot-reloading of MCP servers during development, and Human-in-the-Loop (HIL) safety controls.
## Features
- 🌐 **Multi-Server Support**: Connect to multiple MCP servers simultaneously, expanding the range of available tools.
- 🚀 **Multiple Transport Types**: Supports STDIO, Server-Sent Events (SSE), and Streamable HTTP server connections for flexible integration.
- 🎨 **Rich Terminal Interface**: Interactive console UI for easy navigation and control.
- 🌊 **Streaming Responses**: View model outputs in real-time as they are generated.
- 🛠️ **Tool Management**: Enable/disable specific tools or entire servers during chat sessions to tailor functionality.
- 🧑‍💻 **Human-in-the-Loop (HIL)**: Review and approve tool executions before they run for enhanced control and safety, especially important for potentially sensitive operations.
- 🎮 **Advanced Model Configuration**: Fine-tune over 10 model parameters, including temperature, sampling, and repetition control, to optimize model behavior.
- 💬 **System Prompt Customization**: Define and edit the system prompt to control model behavior and persona, influencing the model's responses.
- 🎨 **Enhanced Tool Display**: Beautiful, structured visualization of tool executions with JSON syntax highlighting for easy debugging.
- 🧠 **Context Management**: Control conversation memory with configurable retention settings, allowing for short-term or long-term memory.
- 🤔 **Thinking Mode**: Advanced reasoning capabilities with visible thought processes for supported models (e.g., gpt-oss, deepseek-r1, qwen3), providing insight into the model's decision-making process.
- 🗣️ **Cross-Language Support**: Seamlessly work with both Python and JavaScript MCP servers, broadening compatibility.
- 🔍 **Auto-Discovery**: Automatically find and use Claude's existing MCP server configurations for quick setup.
- 🔁 **Dynamic Model Switching**: Switch between any installed Ollama model without restarting the client.
- 💾 **Configuration Persistence**: Save and load tool preferences between sessions for consistent behavior.
- 🔄 **Server Reloading**: Hot-reload MCP servers during development without restarting the client, enabling rapid iteration.
- ✨ **Fuzzy Autocomplete**: Interactive, arrow-key command autocomplete with descriptions for efficient command entry.
- 🏷️ **Dynamic Prompt**: Shows current model, thinking mode, and enabled tools in the prompt for clear context.
- 📊 **Performance Metrics**: Detailed model performance data after each query, including duration timings and token counts, for performance analysis.
- 🔌 **Plug-and-Play**: Works immediately with standard MCP-compliant tool servers.
- 🔔 **Update Notifications**: Automatically detects when a new version is available.
- 🖥️ **Modern CLI with Typer**: Grouped options, shell autocompletion, and improved help output for a user-friendly command-line experience.
## Requirements
- **Python 3.10+** ([Installation guide](https://www.python.org/downloads/))
- **Ollama** running locally ([Installation guide](https://ollama.com/download))
- **UV package manager** ([Installation guide](https://github.com/astral-sh/uv))
## Quick Start
**Option 1:** Install with pip and run

```bash
pip install --upgrade ollmcp
ollmcp
```

**Option 2:** One-step install and run using uvx

```bash
uvx ollmcp
```

**Option 3:** Install from source and run using a virtual environment

```bash
git clone https://github.com/jonigl/mcp-client-for-ollama.git
cd mcp-client-for-ollama
uv venv && source .venv/bin/activate
uv pip install .
uv run -m mcp_client_for_ollama
```
## Usage

Run with default settings:

```bash
ollmcp
```

If you don't provide any options, the client will use auto-discovery mode to find MCP servers from Claude's configuration.

> [!TIP]
> The CLI now uses Typer for a modern experience: grouped options, rich help, and built-in shell autocompletion. Advanced users can use short flags for faster commands. To enable autocompletion, run `ollmcp --install-completion`, then restart your shell or follow the printed instructions.
### Command-line Arguments

- `--mcp-server`, `-s`: Path to one or more MCP server scripts (`.py` or `.js`). Can be specified multiple times to connect to multiple local servers.
- `--mcp-server-url`, `-u`: URL to one or more SSE or Streamable HTTP MCP servers. Can be specified multiple times to connect to multiple remote servers.
- `--servers-json`, `-j`: Path to a JSON file containing server configurations (see the sketch after this list and [Server Configuration Format](#server-configuration-format) for details).
- `--auto-discovery`, `-a`: Auto-discover servers from Claude's default config file (default behavior if no other server options are provided).
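For `--servers-json`, a minimal sketch of a config file, assuming it mirrors Claude's `mcpServers` layout as the auto-discovery feature suggests (the server name, command, and arguments here are illustrative, not part of this project; see [Server Configuration Format](#server-configuration-format) for the authoritative schema):

```json
{
  "mcpServers": {
    "weather": {
      "command": "uvx",
      "args": ["mcp-server-weather"]
    }
  }
}
```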
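Putting the flags above together, a few invocation sketches (all paths and URLs are placeholders):

```bash
# Connect to a single local MCP server script
ollmcp --mcp-server /path/to/weather_server.py

# Combine a remote Streamable HTTP server with a local script, using short flags
ollmcp -u http://localhost:8000/mcp -s ./tools/files_server.py

# Load all servers defined in a JSON configuration file
ollmcp --servers-json ./servers.json
```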