MCP-Bridge bridges the OpenAI API and MCP tools, enabling developers to use MCP tools through the OpenAI API interface. It supports chat completions, MCP tool calling, and an SSE bridge.
MCP-Bridge exposes endpoints compatible with the OpenAI API, so any client can use any MCP tool without needing explicit support for MCP.
Current features include non-streaming and streaming chat completions with MCP, non-streaming completions without MCP, MCP tools, and MCP sampling. Planned features include streaming completions and MCP resource support.
Installation is recommended via Docker and requires an inference engine with tool-call support, such as vLLM or Ollama. Configuration is done by editing the config.json file to specify inference server details, MCP servers, and API keys for authentication.
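As a rough sketch of such a file (the exact key names here are assumptions and should be checked against the project's own documentation), a config.json along these lines would describe the inference server, the MCP servers to launch, and the API keys the bridge accepts:

```json
{
  "inference_server": {
    "base_url": "http://localhost:11434/v1",
    "api_key": "sk-placeholder"
  },
  "mcp_servers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  },
  "api_keys": ["example-mcp-bridge-key"]
}
```

The values shown (the local URL, the fetch server, and the key) are placeholders for illustration only.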
MCP-Bridge exposes REST API endpoints for interacting with MCP primitives and provides an SSE bridge for external clients. It supports API key authentication for security. Contributions are welcome, and the project is released under the MIT License.
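Because the endpoints are OpenAI-compatible, an existing OpenAI client can be pointed at the bridge unchanged. The sketch below uses the official openai Python package; the base URL, port, API key, and model name are placeholders rather than documented defaults.

```python
# Sketch: calling MCP-Bridge through its OpenAI-compatible chat completions endpoint.
# The base_url, api_key, and model below are assumed placeholders, not project defaults.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",   # assumed address of a local MCP-Bridge instance
    api_key="example-mcp-bridge-key",      # a key listed in config.json's api_keys
)

# The bridge advertises the configured MCP tools to the model, so this plain
# OpenAI client can trigger tool use without any MCP-specific code of its own.
response = client.chat.completions.create(
    model="your-model-name",  # whatever model the underlying inference engine serves
    messages=[{"role": "user", "content": "Fetch https://example.com and summarise it."}],
)
print(response.choices[0].message.content)
```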
SecretiveShell/MCP-Bridge
November 30, 2024
March 28, 2025
Python