WebSocket MCP

This project offers a WebSocket-based implementation of the Model Context Protocol (MCP) for communication between clients and local large language model (LLM) servers, overcoming limitations of the standard MCP transport layers. It includes both a server and a client, along with a local LLM chatbot demo.

Modules

  • MCP Server: A robust server for the Model Context Protocol, using WebSockets for efficient message exchange.
  • MCP Client: A generic client for interacting with MCP servers, supporting resource listing and data streaming.
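MCP exchanges JSON-RPC 2.0 messages, and over a WebSocket transport each message maps naturally to one text frame. The sketch below shows that framing in isolation; the helper names are illustrative rather than taken from this repository, and `resources/list` is a standard MCP method:

```python
import json

def make_request(request_id, method, params=None):
    """Build a JSON-RPC 2.0 request, sent as a single WebSocket text frame."""
    msg = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

def parse_response(raw):
    """Decode a JSON-RPC 2.0 response frame, surfacing protocol errors."""
    msg = json.loads(raw)
    if msg.get("jsonrpc") != "2.0":
        raise ValueError("not a JSON-RPC 2.0 message")
    if "error" in msg:
        raise RuntimeError(f"server returned error: {msg['error']}")
    return msg["result"]

# Example: the frame a client would send to list server resources.
frame = make_request(1, "resources/list")
```

Because each request carries an `id`, the client can match streamed responses to outstanding requests even when frames arrive out of order.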

Demo

  • Local LLM Server: Hosts the LLM, handles requests, and lists available models using the ollama package.
  • LLM Client: Connects to the server to send prompts and receive responses, configurable for remote server connections.
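The demo's server side talks to a local Ollama daemon via the `ollama` package, as noted above. A minimal sketch of that interaction might look like the following; the function names, the `"llama3"` model, and the exact response field names are assumptions, not code from this repository:

```python
def build_messages(prompt, history=None):
    """Assemble the chat message list the ollama chat API expects."""
    messages = list(history or [])
    messages.append({"role": "user", "content": prompt})
    return messages

def run_demo(model="llama3"):
    """Hypothetical demo loop; requires a running Ollama daemon.

    Field access on the responses may vary across ollama package versions.
    """
    import ollama  # third-party package the demo server uses
    # List the models available on the local daemon.
    print("models:", [m["model"] for m in ollama.list()["models"]])
    # Send one prompt and return the model's reply text.
    reply = ollama.chat(model=model, messages=build_messages("Hello!"))
    return reply["message"]["content"]
```

Keeping message assembly separate from the network call makes the prompt-handling logic easy to test without a live LLM.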

Usage

The server and clients are configured via command-line arguments and default to local operation. The project supports multiple LLM models and configurations, and the included examples show how to run the server and client, submit prompts, and retrieve responses.
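A command-line interface that defaults to local operation could be sketched as below; the flag names, default port, and default model here are hypothetical stand-ins, not the repository's actual options:

```python
import argparse

def parse_args(argv=None):
    """Hypothetical CLI: every option defaults to local operation."""
    p = argparse.ArgumentParser(description="WebSocket MCP demo")
    p.add_argument("--host", default="127.0.0.1",
                   help="address to bind (server) or connect to (client)")
    p.add_argument("--port", type=int, default=8765,
                   help="WebSocket port")
    p.add_argument("--model", default="llama3",
                   help="LLM model name to serve")
    return p.parse_args(argv)

args = parse_args([])  # no arguments: falls back to local defaults
```

Passing `--host` lets the same client binary point at a remote server, matching the configurable remote connections described above.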

Repository


zeropointo/websocket_mcp

Created

February 24, 2025

Updated

February 24, 2025

Language

Python

Category

AI