This repository demonstrates how LLMs interact with MCP servers, which expose tools, execute functions, provide static content, and offer preset prompts through a unified framework.
Model Context Protocol (MCP) standardizes how applications provide context to LLMs, offering a unified framework for connecting to data sources, using tools, and executing prompts. The MCP ecosystem includes MCP Servers (managing tool availability, execution, static content, and prompts), Clients (managing server connections, LLM integration, and message passing), and Hosts (providing frontend interfaces and integration points).
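As a sketch of the client role in this architecture, the snippet below uses the official `mcp` Python SDK to spawn a server over stdio, initialize a session, and list the tools it exposes. The server script name mirrors this repository's `mcp_server.py`, but the rest is illustrative rather than the actual code in `client.py`.

```python
# Minimal sketch of an MCP client connecting to a server over stdio.
# Assumes the official `mcp` Python SDK; the server path is illustrative.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(command="python", args=["mcp_server.py"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()          # handshake with the server
            tools = await session.list_tools()  # discover available tools
            print([tool.name for tool in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())
```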
MCP servers expose standardized capabilities through well-defined interfaces. They feature tools (functions an LLM can invoke, each defined by a name, description, and input schema), resources (data sources identified by URIs), and prompts (reusable templates for common interaction patterns).
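A minimal sketch of these three primitives, using the `FastMCP` helper from the official Python SDK, might look like the following; the tool, resource, and prompt shown here are placeholders, not this repository's actual definitions.

```python
# Sketch of an MCP server exposing one tool, one resource, and one prompt.
# Built on the official `mcp` Python SDK; all names here are illustrative.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("example-server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""  # docstring and type hints become the tool's schema
    return a + b

@mcp.resource("docs://readme")
def readme() -> str:
    """Static content identified by a URI."""
    return "Example resource content."

@mcp.prompt()
def summarize(text: str) -> str:
    """Reusable prompt template."""
    return f"Summarize the following text:\n\n{text}"

if __name__ == "__main__":
    mcp.run(transport="stdio")  # serve over stdio for local clients
```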
This example implements a knowledgebase chatbot flow with tools for querying a vector database, user-selectable resources for context, and standard prompts for analytical workflows. The server implementation lives in `mcp_server.py`, with a CLI client in `client.py`.
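A hedged sketch of the vector database tool might look like the snippet below, which wraps a ChromaDB query in an MCP tool. The database path, collection name, and return format are assumptions for illustration, not the exact code in `mcp_server.py`.

```python
# Sketch of a knowledgebase query tool backed by ChromaDB.
# Path and collection name are assumptions; see mcp_server.py for the real code.
import chromadb
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("kb-server")
chroma = chromadb.PersistentClient(path="./chroma_db")     # assumed database location
collection = chroma.get_collection(name="knowledgebase")   # assumed collection name

@mcp.tool()
def query_knowledgebase(question: str, n_results: int = 3) -> str:
    """Return the documents most similar to the question."""
    results = collection.query(query_texts=[question], n_results=n_results)
    docs = results["documents"][0]  # documents matching the first (only) query
    return "\n\n---\n\n".join(docs)

if __name__ == "__main__":
    mcp.run(transport="stdio")
```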
To set up:

1. Clone the repository.
2. Create the ChromaDB database using `MCP_setup.ipynb`.
3. Create a virtual environment and install dependencies with `uv sync`.
4. Run the client and server with `python client.py mcp_server.py`.
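Concretely, these steps map to roughly the following commands (the clone URL follows the repository name; exact commands may vary on your platform):

```sh
git clone https://github.com/ALucek/quick-mcp-example.git
cd quick-mcp-example

# Build the ChromaDB vector database by running MCP_setup.ipynb
# (e.g. in Jupyter) before starting the server.

uv venv                          # create a virtual environment
source .venv/bin/activate
uv sync                          # install dependencies

python client.py mcp_server.py   # start the client, which launches the server
```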