This repository hosts a CLI tool for interacting with Model Context Protocol servers, enabling users to send commands, query data, and manage resources. It supports OpenAI and Ollama providers and features chat and interactive modes with conversation history and server-aware tools.
Supports OpenAI (gpt-4o-mini default) and Ollama (qwen2.5-coder default) providers and models. (Instructions for cloning the repository, installing UV, and synchronizing dependencies are provided.)
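The setup steps summarized above might look like the following sketch, assuming the standard uv workflow; the repository URL is taken from the project metadata, and `pip install uv` is just one common way to obtain uv:

```shell
# Clone the repository and enter it
git clone https://github.com/chrishayuk/mcp-cli.git
cd mcp-cli

# Install uv if it is not already available (one common method)
pip install uv

# Create the virtual environment and install the project's dependencies
uv sync
```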
(Details on using the --server, --config-file, --provider, and --model arguments are provided.)
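To illustrate those arguments, an invocation might look like this; note that the server name `sqlite` and the config file path are hypothetical placeholders, not values taken from the original text:

```shell
# Hypothetical example: server name and config path are placeholders
mcp-cli --server sqlite --config-file server_config.json \
        --provider openai --model gpt-4o-mini
```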
(Details on using chat mode with examples, including specifying providers and models.)
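A chat-mode session along the lines described above might be started as follows; the `chat` subcommand and the `sqlite` server name are assumptions based on this summary, not verified commands:

```shell
# Start chat mode with the default provider and model (assumed syntax)
mcp-cli chat --server sqlite

# Or specify the provider and model explicitly
mcp-cli chat --server sqlite --provider ollama --model qwen2.5-coder
```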
(Explanation of natural language interaction and tool usage.)
(Explanation of reviewing, analyzing, saving, and filtering conversation history with command examples.)
(Explanation of tracking tool calls with command examples.)
(Explanation of changing provider and model during a chat session with command examples.)
(Lists available slash commands for general, tool, conversation, and other functions.)
(Details on using interactive mode with examples, including specifying providers and models.)
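Interactive mode might be launched similarly; as above, the `interactive` subcommand and the `sqlite` server name are assumed for illustration:

```shell
# Hypothetical: start interactive mode, optionally overriding provider/model
mcp-cli interactive --server sqlite
mcp-cli interactive --server sqlite --provider openai --model gpt-4o-mini
```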
(Lists available slash commands for interactive mode.)
(Instructions for setting the OPENAI_API_KEY environment variable.)
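Setting the environment variable for the current shell session is a one-liner; the key value here is a placeholder:

```shell
# Set the OpenAI API key for the current shell session (placeholder value)
export OPENAI_API_KEY="your-api-key-here"
```

To persist it across sessions, the same line can be added to a shell profile such as `~/.bashrc`.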
(Provides a directory structure overview.)
(Encourages contributions via issues and pull requests.)
(Specifies the MIT License.)
chrishayuk/mcp-cli
November 30, 2024
March 28, 2025
Python