The Tavily MCP Server

This repository hosts a Model Context Protocol (MCP) server that leverages the Tavily search API to provide AI-powered web search capabilities for Large Language Models (LLMs). It enables LLMs to perform web searches, obtain direct answers, and search recent news with AI-extracted content.
The server offers three primary tools: tavily_web_search for general web searches, tavily_answer_search for direct answers with supporting evidence, and tavily_news_search for recent news articles. Each tool accepts a search query and optional parameters such as max_results, search_depth, and domain filters. Prompt templates are also provided for each search type.
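As a sketch of how an MCP client might assemble arguments for these tools: max_results and search_depth are parameters named in the source, while the include_domains and exclude_domains filter names below are assumptions for illustration.

```python
def build_search_args(query, max_results=5, search_depth="basic",
                      include_domains=None, exclude_domains=None):
    """Assemble a JSON-serializable argument payload for a tool such as
    tavily_web_search. Filter parameter names are hypothetical."""
    args = {
        "query": query,
        "max_results": max_results,
        "search_depth": search_depth,  # e.g. "basic" or "advanced"
    }
    if include_domains:
        args["include_domains"] = include_domains
    if exclude_domains:
        args["exclude_domains"] = exclude_domains
    return args

args = build_search_args("latest MCP spec changes",
                         max_results=3,
                         search_depth="advanced",
                         include_domains=["modelcontextprotocol.io"])
```

The optional filters are only included when set, so the payload stays minimal for simple queries.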
The server requires Python 3.11+, a Tavily API key, and uv (recommended). Installation can be done via pip or uv, or directly from source. The README provides detailed instructions for each method, including setting up a virtual environment and installing dependencies.
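The prerequisites above can be checked programmatically before installing. The snippet below is a minimal sketch, assuming the API key is exposed as the TAVILY_API_KEY environment variable:

```python
import os
import shutil
import sys

def check_prerequisites():
    """Report which README prerequisites appear to be satisfied."""
    return {
        "python_3_11_plus": sys.version_info >= (3, 11),
        # Assumes the key is provided via this environment variable.
        "tavily_api_key_set": bool(os.environ.get("TAVILY_API_KEY")),
        # uv is recommended but not strictly required.
        "uv_on_path": shutil.which("uv") is not None,
    }

status = check_prerequisites()
```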
Configuration involves supplying the Tavily API key through a .env file, an environment variable, or a command-line argument. The README also provides configuration instructions for Claude.app. Usage examples demonstrate how to perform web searches, generate reports with domain filtering, obtain direct answers, and search for news.
The project includes a test suite that can be run with uv sync --dev followed by ./tests/run_tests.sh. Contributions are welcome: fork the repository, create a feature branch, and submit a pull request. The project is licensed under the MIT License.
RamXX/mcp-tavily · December 1, 2024 – March 26, 2025 · Python