mcp-llm-bridge

This repository provides a bridge connecting Model Context Protocol (MCP) servers to OpenAI-compatible LLMs, enabling the use of MCP-compliant tools with models like GPT-4o through a standardized interface.


The bridge primarily targets the OpenAI API, but it also works with local endpoints that implement the OpenAI API specification, such as Ollama and LM Studio.

The implementation translates bidirectionally between MCP and OpenAI's function-calling interface: it converts MCP tool specifications into OpenAI function schemas and maps function invocations back to MCP tool executions. This allows any OpenAI-compatible language model, cloud-based or local, to use MCP-compliant tools through a standardized interface.
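The MCP-to-OpenAI direction of that translation is largely a structural mapping, since MCP tool input schemas are already JSON Schema, which is what OpenAI's function-calling API expects. A minimal sketch (the function name and the example tool here are illustrative, not taken from the bridge's source):

```python
def mcp_tool_to_openai_function(tool: dict) -> dict:
    """Convert an MCP tool specification into an OpenAI function schema.

    Hypothetical helper illustrating the mapping; the bridge's actual
    internals may differ.
    """
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP's inputSchema is JSON Schema, which maps directly onto
            # the "parameters" field of an OpenAI function definition.
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

# Example MCP tool specification (illustrative):
tool = {
    "name": "read_query",
    "description": "Run a read-only SQL query",
    "inputSchema": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
}
schema = mcp_tool_to_openai_function(tool)
```

The reverse direction works analogously: when the model emits a function call, the bridge looks up the MCP tool of the same name and forwards the parsed arguments as the tool-call input.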

The bridge is configured via a BridgeConfig object that specifies the MCP server parameters and the LLM configuration, including the API key, model name, and base URL.
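A configuration might look like the following sketch. It assumes the package exposes BridgeConfig and LLMConfig and reuses mcp's StdioServerParameters; the import paths and the SQLite example server are assumptions to verify against the source.

```python
import os

# Assumed import paths; check the repository for the actual module layout.
from mcp import StdioServerParameters
from mcp_llm_bridge.config import BridgeConfig, LLMConfig

config = BridgeConfig(
    mcp_server_params=StdioServerParameters(
        command="uvx",
        args=["mcp-server-sqlite", "--db-path", "test.db"],  # example MCP server
        env=None,
    ),
    llm_config=LLMConfig(
        api_key=os.getenv("OPENAI_API_KEY"),
        model="gpt-4o",
        base_url=None,  # or a local endpoint, e.g. "http://localhost:11434/v1" for Ollama
    ),
)
```

Pointing base_url at a local OpenAI-compatible endpoint is what switches the bridge from the cloud API to a local model without changing any other wiring.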

To use the bridge, install the package, set your OpenAI API key in the .env file, and run python -m mcp_llm_bridge.main. You can then query the LLM in natural language, and it will leverage the connected MCP tools.
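Concretely, the steps above might look like this; the repository URL matches the metadata below, but the virtual-environment and editable-install steps are assumptions about a standard pip-installable layout:

```shell
# Clone and install (assumes a standard pyproject/pip layout).
git clone https://github.com/bartolli/mcp-llm-bridge.git
cd mcp-llm-bridge
python -m venv .venv && . .venv/bin/activate
pip install -e .

# Provide the API key, then start the bridge.
echo 'OPENAI_API_KEY=sk-...' > .env
python -m mcp_llm_bridge.main
```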

Repository: bartolli/mcp-llm-bridge

Created: December 3, 2024

Updated: March 28, 2025

Language: Python

Category: AI