This repository provides a CLI client for running LLM prompts and acting as a Model Context Protocol (MCP) client, enabling interaction with MCP-compatible servers from the terminal. It serves as a lightweight alternative to Claude Desktop and supports multiple LLM providers, including OpenAI, Groq, and local models via llama.cpp.
The CLI accepts text input directly as an argument or via a pipe, and supports image analysis with multimodal LLMs. Predefined prompt templates can be invoked with the `p` prefix. Tools can be triggered during a conversation; they require confirmation by default, which can be bypassed with `--no-confirmations`. The `--no-intermediates` flag suppresses intermediate output for scripting. Clipboard content, both text and images, can be used with the `cb` command. Additional options include listing available tools and prompts, disabling tools, forcing a cache refresh, and overriding the model.
adhikasp/mcp-client-cli · November 27, 2024 – March 28, 2025 · Python