MCP-Chinese-Getting-Started-Guide

This repository provides a quick start guide for Model Context Protocol (MCP) programming, focusing on building MCP servers with tools for large language models. It uses Python and the `uv` package manager.

Model Context Protocol(MCP) Programming Quick Start

[TOC]

Introduction

The Model Context Protocol (MCP) is an open-source protocol that redefines how large language models (LLMs) interact with the external world. MCP provides a standardized way for any LLM to connect to external data sources and tools, enabling seamless information access and processing; it acts like a USB-C port for AI applications, giving models one common interface to many different systems.

MCP defines several core concepts: Resources, Prompts, Tools, Sampling, Roots, and Transports. This guide focuses on building MCP servers for general-purpose LLMs, with an emphasis on Tools. The transport layer supports stdio (standard input/output) and SSE (Server-Sent Events); the examples here use stdio.

This guide uses Python 3.11 and uv for project management. Code examples are available on GitHub.

Developing an MCP Server

This section demonstrates implementing a web search server: uv initializes the project and installs dependencies such as mcp[cli], httpx, and openai. The FastMCP object simplifies server creation, and the @mcp.tool() decorator exposes a function as a tool, deriving its parameter schema and description from the function's type hints and docstring. The example calls the Zhipu AI web search API and returns a summary of the search results.

Debugging MCP Servers

The official Inspector tool, launched via npx or the mcp dev command, allows debugging the server. Once the server is running, you can connect through the Inspector's web interface and exercise the implemented tools interactively.
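
The two launch commands might look like this, assuming the server lives in a file named server.py (a hypothetical path):

```shell
# Option 1: via the mcp CLI (requires the mcp[cli] extra)
mcp dev server.py

# Option 2: run the Inspector directly with npx
npx @modelcontextprotocol/inspector uv run server.py
```

Both commands start the server and open the Inspector, which connects to it over stdio.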

Developing MCP Clients

The guide explains how to invoke tools from an MCP server using Python. It demonstrates connecting to a server via stdio_client and ClientSession, listing available tools, and calling them with parameters. An example shows how to integrate the MCP server with the DeepSeek LLM, using system prompts to guide the model to use the web search tool.

Sampling Explanation

MCP's Sampling feature provides hooks before and after tool execution. This allows for actions like user confirmation before deleting files. The guide demonstrates creating a sampling callback function that prompts the user for confirmation before executing a file deletion tool.

Claude Desktop Loading MCP Server

This section explains how to load custom MCP servers into the Claude desktop application by modifying the claude_desktop_config.json file.
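
A sketch of one such config entry; the server name "web-search" and the launch command are assumptions, and the project path is a placeholder to adjust for your machine:

```json
{
  "mcpServers": {
    "web-search": {
      "command": "uv",
      "args": ["--directory", "/path/to/project", "run", "server.py"]
    }
  }
}
```

After saving the file and restarting Claude Desktop, the server's tools appear in the client.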

Other Functions

Prompt

MCP offers a prompt template generation feature using the @app.prompt decorator.

Resource

MCP lets a server expose predefined resources that users can select in the Claude client, and supports custom URI protocols for addressing them.

Using MCP Servers in LangChain

LangChain's langchain-mcp-adapters project facilitates integrating MCP servers into LangChain.

DeepSeek + cline + Custom MCP = Text-to-Image Master

This section demonstrates building a text-to-image application using DeepSeek, the cline plugin for VS Code, and a custom MCP server for image generation.

Repository

liaokongVFX/MCP-Chinese-Getting-Started-Guide
Created: March 1, 2025
Updated: March 28, 2025
Category: AI