A flexible Python library and CLI tool for interacting with Model Context Protocol (MCP) servers using any LLM.
Dolphin MCP is a Python library and CLI tool that enables querying and interaction with MCP servers using natural language. It connects to multiple configured MCP servers and makes their tools available to language models served via OpenAI, Anthropic, Ollama, or LMStudio, providing a conversational interface for accessing and manipulating data from those servers.
The project demonstrates how to:

- Connect to multiple MCP servers
- List and call their tools
- Use function calling to interact with external data
- Process and present results
- Create a reusable Python library
- Build a CLI on top of that library
Installing the package provides the `dolphin-mcp-cli` command.

## Prerequisites

Before installing Dolphin MCP, ensure you have the following prerequisites installed:

### Windows

Install Python and SQLite (for example, extract the SQLite tools to `C:\sqlite` and add that directory to your `PATH`), then verify:

```
sqlite3 --version
```

Install `uv` and verify:

```
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
uv --version
```
### macOS

```
brew install python
brew install sqlite
sqlite3 --version
brew install uv    # or: curl -LsSf https://astral.sh/uv/install.sh | sh
uv --version
```
### Linux (Debian/Ubuntu)

```
sudo apt update
sudo apt install python3 python3-pip sqlite3
sqlite3 --version
curl -LsSf https://astral.sh/uv/install.sh | sh
uv --version
```
## Installation

### Option 1: Install from PyPI

```
pip install dolphin-mcp
```

This installs both the library and the `dolphin-mcp-cli` command-line tool.
### Option 2: Install from source

```
git clone https://github.com/cognitivecomputations/dolphin-mcp.git
cd dolphin-mcp
pip install -e .
cp .env.example .env
```

Then edit the `.env` file to add your OpenAI API key, and set up the demo database:

```
python setup_db.py
```

This creates a sample SQLite database with dolphin information that you can use to test the system.

## Configuration

The project uses two main configuration files:
`.env` - Contains OpenAI API configuration:

```
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_MODEL=gpt-4o
# OPENAI_BASE_URL=https://api.openai.com/v1  # Uncomment and modify if using a custom base URL
```
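A `.env` file is just `KEY=VALUE` lines with `#` comments. As a quick sanity check that yours is well-formed, here is a minimal stdlib-only parser; it is purely illustrative and not part of the dolphin-mcp package:

```python
def parse_env(text: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """OPENAI_API_KEY=your_openai_api_key_here
OPENAI_MODEL=gpt-4o
# OPENAI_BASE_URL=https://api.openai.com/v1
"""
print(parse_env(sample)["OPENAI_MODEL"])  # gpt-4o
```

Note that the commented-out `OPENAI_BASE_URL` line is correctly ignored until you uncomment it.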
`mcp_config.json` - Defines the MCP servers to connect to:

```json
{
  "mcpServers": {
    "server1": {
      "command": "command-to-start-server",
      "args": ["arg1", "arg2"],
      "env": {
        "ENV_VAR1": "value1",
        "ENV_VAR2": "value2"
      }
    },
    "server2": {
      "command": "another-server-command",
      "args": ["--option", "value"]
    }
  }
}
```
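For a concrete sense of the schema, the snippet below parses an `mcp_config.json`-style document with the standard library and lists the configured servers. The server name `dolphin-db`, its command, and the `DB_PATH` variable are invented for illustration, not taken from the project:

```python
import json

# An mcp_config.json-style document; the server entry here is hypothetical.
config_text = """
{
  "mcpServers": {
    "dolphin-db": {
      "command": "python",
      "args": ["server.py"],
      "env": {"DB_PATH": "dolphins.db"}
    }
  }
}
"""

config = json.loads(config_text)
for name, server in config["mcpServers"].items():
    # Print each server's launch command, e.g. "dolphin-db: python server.py"
    print(f"{name}: {server['command']} {' '.join(server.get('args', []))}")
```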
You can add as many MCP servers as you need; the client will connect to all of them and make their tools available.

## Usage

### CLI

Run the CLI command with your query as an argument:
```
dolphin-mcp-cli "Your query here"
```

```
Usage: dolphin-mcp-cli [--model <name>] [--quiet] [--config <file>] 'your question'

Options:
  --model <name>    Specify the model to use
  --quiet           Suppress intermediate output
  --config <file>   Specify a custom config file (default: mcp_config.json)
  --help, -h        Show this help message
```
### Library

You can also use Dolphin MCP as a library in your Python code:
```python
import asyncio

from dolphin_mcp import run_interaction

async def main():
    result = await run_interaction(
        user_query="What dolphin species are endangered?",
        model_name="gpt-4o",            # Optional, will use default from config if not specified
        config_path="mcp_config.json",  # Optional, defaults to mcp_config.json
        quiet_mode=False                # Optional, defaults to False
    )
    print(result)

# Run the async function
asyncio.run(main())
```
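Since `run_interaction` is a coroutine, calling it from ordinary synchronous code just means driving it with `asyncio.run`. A small sketch of that pattern follows; the `fake_interaction` stand-in exists only to keep the example self-contained, and in real use you would pass `dolphin_mcp.run_interaction` instead:

```python
import asyncio
from typing import Any, Awaitable, Callable

def run_sync(interaction: Callable[..., Awaitable[Any]], **kwargs: Any) -> Any:
    """Drive an async interaction function to completion from sync code."""
    return asyncio.run(interaction(**kwargs))

# Stand-in coroutine so this sketch runs on its own; replace with
# dolphin_mcp.run_interaction in real code.
async def fake_interaction(user_query: str, **_ignored: Any) -> str:
    return f"answered: {user_query}"

print(run_sync(fake_interaction, user_query="What dolphin species are endangered?"))
# answered: What dolphin species are endangered?
```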
You can still run the original script directly:

```
python dolphin_mcp.py "Your query here"
```

The tool will connect to the configured MCP servers, make their tools available to the language model, and return the model's answer to your query.
## Examples

Examples will depend on the MCP servers you have configured. With the demo dolphin database:

```
dolphin-mcp-cli "What dolphin species are endangered?"
```

Or with your own custom MCP servers:

```
dolphin-mcp-cli "Query relevant to your configured servers"
```

You can also specify a model to use:

```
dolphin-mcp-cli --model gpt-4o "What are the evolutionary relationships between dolphin species?"
```

To use the LMStudio provider:

```
dolphin-mcp-cli --model qwen2.5-7b "What are the evolutionary relationships between dolphin species?"
```

For quieter output (suppressing intermediate results):

```
dolphin-mcp-cli --quiet "List all dolphin species in the Atlantic Ocean"
```
## Demo Database

If you run `setup_db.py`, it will create a sample SQLite database with information about dolphin species. This is provided as a demonstration of how the system works with a simple MCP server.
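For intuition, here is a tiny stdlib sketch of the kind of database such a script could build. The table and column names below are invented for illustration and need not match what `setup_db.py` actually creates:

```python
import sqlite3

# Build a small in-memory database of dolphin species (illustrative schema).
conn = sqlite3.connect(":memory:")  # use a file path for a persistent database
conn.execute(
    "CREATE TABLE dolphins (name TEXT, region TEXT, endangered INTEGER)")
conn.executemany(
    "INSERT INTO dolphins VALUES (?, ?, ?)",
    [("Maui dolphin", "New Zealand", 1),
     ("Bottlenose dolphin", "Atlantic Ocean", 0)],
)

# An MCP server exposing this database would run queries like this on behalf
# of the language model:
rows = conn.execute(
    "SELECT name FROM dolphins WHERE endangered = 1").fetchall()
print(rows)  # [('Maui dolphin',)]
```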
This is just one example of what you can do with the Dolphin MCP client. You can connect it to any MCP server that provides tools for accessing different types of data or services.
All dependencies are automatically installed when you install the package using pip.
## Project Structure

The package is organized into several modules:

- `dolphin_mcp/` - Main package directory
  - `__init__.py` - Package initialization and exports
  - `client.py` - Core MCPClient implementation and the `run_interaction` function
  - `cli.py` - Command-line interface
  - `utils.py` - Utility functions for configuration and argument parsing
  - `providers/` - Provider-specific implementations
    - `openai.py` - OpenAI API integration
    - `anthropic.py` - Anthropic API integration
    - `ollama.py` - Ollama API integration
    - `lmstudio.py` - LMStudio SDK integration

The CLI is a thin layer over the library's `run_interaction` function. At startup, the client reads `mcp_config.json` and connects to each configured MCP server. This modular architecture allows for great flexibility: you can add any MCP server that provides tools for accessing different data sources or services, and the client will automatically make those tools available to the language model. The provider-specific modules also make it easy to add support for additional language model providers in the future.
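One common way to route a request to the right provider module is to match on the model name. The sketch below shows that pattern in miniature; the prefix table and the `pick_provider` function are purely illustrative and may not reflect how dolphin-mcp actually selects providers:

```python
# Hypothetical model-name-to-provider mapping, not dolphin-mcp's real logic.
PROVIDERS = {
    "gpt": "openai",
    "claude": "anthropic",
    "qwen": "lmstudio",
    "llama": "ollama",
}

def pick_provider(model_name: str) -> str:
    """Map a model name to a provider module name (defaulting to openai)."""
    for prefix, provider in PROVIDERS.items():
        if model_name.lower().startswith(prefix):
            return provider
    return "openai"

print(pick_provider("gpt-4o"))      # openai
print(pick_provider("qwen2.5-7b"))  # lmstudio
```

A table-driven dispatch like this keeps adding a new provider down to one new module plus one new table entry.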
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
[Add your license inf...