mcp-client-langchain-py

This repository provides a Python-based Model Context Protocol (MCP) client built with LangChain. It enables interaction with MCP servers through a LangChain ReAct agent and supports LLMs from Anthropic, OpenAI, and Groq.

MCP Client Using LangChain / Python

This project is a Model Context Protocol (MCP) client that demonstrates the use of MCP server tools with a LangChain ReAct agent. It uses convert_mcp_to_langchain_tools() from the langchain_mcp_tools library to initialize multiple MCP servers and convert their tools into a list of LangChain-compatible tools. LLMs from Anthropic, OpenAI, and Groq are supported.
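A minimal sketch of how these pieces fit together, assuming the tools-plus-cleanup return shape described in the langchain_mcp_tools documentation; the server definition, model name, and query below are illustrative placeholders, not the repository's actual code:

```python
import asyncio

from langchain_anthropic import ChatAnthropic
from langgraph.prebuilt import create_react_agent
from langchain_mcp_tools import convert_mcp_to_langchain_tools


async def main() -> None:
    # MCP server definitions, keyed by server name -- the same shape as the
    # "mcp_servers" section of llm_mcp_config.json5 (example server only).
    mcp_servers = {
        "fetch": {
            "command": "uvx",
            "args": ["mcp-server-fetch"],
        },
    }

    # Start the servers and convert their tools into LangChain-compatible
    # tools; the second value is assumed to be an async cleanup callback.
    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
    try:
        llm = ChatAnthropic(model="claude-3-5-haiku-latest")  # placeholder model
        agent = create_react_agent(llm, tools)
        result = await agent.ainvoke(
            {"messages": [("user", "Fetch https://example.com and summarize it")]}
        )
        print(result["messages"][-1].content)
    finally:
        await cleanup()


asyncio.run(main())
```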

Prerequisites: Python 3.11+, optionally uv/uvx and npm/npx (used to launch MCP servers distributed as Python or Node.js packages), and API keys from Anthropic, OpenAI, and/or Groq.

Setup: Install dependencies with make install, copy .env.template to .env and fill in your API keys, then configure the LLM and MCP server settings in llm_mcp_config.json5. The configuration format is similar to Claude for Desktop's, except that the key is mcp_servers instead of mcpServers; the file is parsed as JSON5 and supports environment variable substitution with ${...}.
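For illustration only, a sketch of what llm_mcp_config.json5 might look like; the mcp_servers key and ${...} substitution are described above, while the llm block, server entries, and variable names are hypothetical placeholders (see the repository's own template for the actual schema):

```json5
{
  // LLM settings -- key names here are illustrative placeholders
  llm: {
    provider: "anthropic",
    model: "claude-3-5-haiku-latest",
  },

  // Note: "mcp_servers", not "mcpServers" as in Claude for Desktop
  mcp_servers: {
    filesystem: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
    fetch: {
      command: "uvx",
      args: ["mcp-server-fetch"],
      // ${...} is replaced with the value of the environment variable
      env: { EXAMPLE_TOKEN: "${EXAMPLE_TOKEN}" },
    },
  },
}
```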

Usage: Run the app with make start (make start-v for verbose output, make start-h for help). Example queries, configurable in llm_mcp_config.json5, can be selected by pressing Enter at the prompt.

Repository: hideya/mcp-client-langchain-py

Created: January 12, 2025

Updated: March 27, 2025

Language: Python

Category: AI