mcp-openai

This repository provides a minimal MCP client library, enabling LLM UIs to support the Model Context Protocol through an OpenAI-compatible API. It facilitates integration with locally runnable inference engines.

<div align="center">
  <h1>mcp-openai</h1>
  <p><em>MCP Client with OpenAI compatible API</em></p>
</div>

This project provides an MCP (Model Context Protocol) client designed to integrate with LLMs (Large Language Models) via an OpenAI-compatible API. MCP standardizes how applications provide context to LLMs, similar to a USB-C port for AI.

**Warning:** This is a toy project intended as a reference for minimal MCP client development, and support is not planned.

The client is designed as a library for building LLM UIs that support MCP. Because it speaks the OpenAI API, it works with locally runnable inference engines such as vLLM, Ollama, TGI, llama.cpp, and LMStudio, using standard API features like text generation and function calling.
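
To make "OpenAI-compatible" concrete, the sketch below points the official `openai` Python package at a local engine instead of OpenAI's servers. The `base_url` and model name are assumptions matching a default Ollama setup; any of the engines above works the same way once its OpenAI-compatible endpoint is running.

```python
# Minimal sketch: the standard openai client pointed at a local engine.
# Assumes an Ollama instance on its default port serving "llama3.2";
# swap base_url/model for vLLM, TGI, llama.cpp server, or LMStudio.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local endpoint, not api.openai.com
    api_key="unused",  # local engines typically ignore the key, but one is required
)

response = client.chat.completions.create(
    model="llama3.2",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```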

The README provides instructions on using `uv` for dependency management and environment setup. It then shows how to create an MCP client with custom configuration for MCP servers, the LLM client, and LLM request parameters. The client handles connecting to MCP servers and routing messages between the UI and the LLM, so the LLM can call the tools exposed by the connected servers, as sketched below.
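Since this section only summarizes the README, the following is a hypothetical sketch of that flow, not the library's actual API: the class name `MCPClient` and the methods `connect` and `process_messages` are placeholders, and the server command and model are assumptions. Consult the repository's README for the real names and configuration shape.

```python
# Hypothetical sketch of the flow described above. MCPClient, connect(),
# and process_messages() are placeholder names, NOT this library's real API.
import asyncio
from openai import AsyncOpenAI


async def main():
    client = MCPClient(  # placeholder: ties MCP servers to an LLM client
        mcp_servers={
            # assumed config shape: one stdio MCP server exposing filesystem tools
            "filesystem": {
                "command": "npx",
                "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
            }
        },
        llm_client=AsyncOpenAI(base_url="http://localhost:11434/v1", api_key="unused"),
        llm_params={"model": "llama3.2", "temperature": 0.2},
    )
    await client.connect()  # start the servers and collect their tool definitions

    # One UI turn: the LLM may answer directly or call a server tool first;
    # the returned history includes any tool calls and their results.
    messages = [{"role": "user", "content": "What files are in this directory?"}]
    messages = await client.process_messages(messages)
    print(messages[-1]["content"])


asyncio.run(main())
```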

| Field | Value |
| --- | --- |
| Repository | S1M0N38/mcp-openai |
| Created | December 25, 2024 |
| Updated | March 27, 2025 |
| Language | Python |
| Category | AI |