# Ultra MCP: Model Context Protocol Server

> **All Models. One Interface. Zero Friction.**

[![npm version](https://badge.fury.io/js/ultra-mcp.svg)](https://badge.fury.io/js/ultra-mcp)
[![npm downloads](https://img.shields.io/npm/dm/ultra-mcp.svg)](https://www.npmjs.com/package/ultra-mcp)

🚀 **Ultra MCP** is a Model Context Protocol (MCP) server designed to streamline your AI-powered coding workflows. It exposes OpenAI, Google Gemini, Azure OpenAI, and xAI Grok models through a single, unified MCP interface, making it ideal for use with Claude Code and Cursor. Ultra MCP simplifies working with diverse AI models and eliminates the need to manage multiple API integrations.

![Ultra MCP Dashboard](https://github.com/user-attachments/assets/b2ade474-7c68-458c-84e4-daa73e32ad8c)

> Stop wasting time having meetings with humans. Now it's time to ask AI models to do this.

## Inspiration

This project draws inspiration from:

- **[Agent2Agent (A2A)](https://developers.googleblog.com/en/a2a-a-new-era-of-agent-interoperability/)** by Google:  A pioneering effort in agent-to-agent communication protocols.
- **[Zen MCP](https://github.com/BeehiveInnovations/zen-mcp-server)**: An AI orchestration server enabling Claude to collaborate with multiple AI models.

## Why Ultra MCP?

Ultra MCP builds upon the foundation of projects like Zen MCP, offering several key advantages:

### 🚀 Ease of Use

- **Direct Execution:** No cloning required. Start immediately with `npx ultra-mcp`.
- **Global Installation:** Install globally via npm: `npm install -g ultra-mcp`.
- **Interactive Configuration:** Guided setup with `npx ultra-mcp config`.
- **Minimal Friction:**  Go from zero to AI-powered coding in under a minute.

### 📊 Built-in Usage Analytics

- **Local Data Storage:** All usage data is stored locally using a SQLite database (libSQL).
- **Automatic Tracking:**  Every LLM request is tracked, including token counts and costs.
- **Usage Statistics:**  View your AI usage with `npx ultra-mcp db:stats`.
- **Privacy Focused:** Your data remains on your machine.

### 🌐 Modern Web Dashboard

- **React UI:**  A visually appealing dashboard built with React and Tailwind CSS.
- **Real-time Statistics:**  Monitor usage trends, costs per provider, and model distribution.
- **Easy Access:**  Launch the dashboard with `npx ultra-mcp dashboard`.
- **Configuration UI:**  Manage API keys and model priorities directly from the web interface.

### 🔧 Additional Benefits

- **Simplified Tools:**  Tools are designed with a maximum of 4 parameters for ease of use (compared to Zen MCP's 10-15).
- **Smart Defaults:**  Optimal model selection is configured out-of-the-box.
- **TypeScript First:**  Ensures type safety and a better developer experience.
- **Regular Updates:**  Active development with weekly feature additions.

## Key Features

- 🤖 **Multi-Model Support:** Integrates OpenAI (O3), Google Gemini (2.5 Pro), Azure OpenAI, and xAI Grok models.
- 🔌 **MCP Protocol Compliance:** Adheres to the standard Model Context Protocol interface.
- 🧠 **Deep Reasoning Tools:** Access powerful models for complex problem-solving.
- 🔍 **Investigation & Research:** Built-in tools for in-depth investigation and research tasks.
- 🌐 **Google Search Integration:** Gemini 2.5 Pro leverages real-time web search capabilities.
- ⚡ **Real-time Streaming:** Live model responses via Vercel AI SDK.
- 🔧 **Zero Config Option:** Interactive setup with intelligent defaults.
- 🔑 **Secure Configuration:** Local API key storage using the `conf` library.
- 🧪 **TypeScript:** Full type safety and a modern development environment.

## Quick Start

### Installation

```bash
# Install globally via npm
npm install -g ultra-mcp

# Or run directly with npx (installs if needed)
npx -y ultra-mcp config
```

### Configuration

Set up your API keys interactively:

```bash
npx -y ultra-mcp config
```

This command will:

  1. Display the current configuration status.
  2. Present a provider-first menu to select the AI provider you want to configure.
  3. Guide you through setting API keys, base URLs (if needed), and preferred models.
  4. Store the configuration securely on your system.
  5. Automatically load settings when the server starts.
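
The stored settings are managed by the `conf` library. As a rough illustration only (the key names below are hypothetical, not Ultra MCP's actual schema), the persisted configuration is a small JSON document along these lines:

```json
{
  "openai": { "apiKey": "sk-...", "preferredModel": "o3" },
  "google": { "apiKey": "AIza..." }
}
```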

**New in v0.5.10:**

- 🎯 **Provider-First Configuration:** Select a specific provider to configure.
- 🤖 **OpenAI-Compatible Support:** Configure Ollama (local) or OpenRouter (400+ models).
- 📋 **Model Selection:** Choose your preferred model from categorized lists.

### Running the Server

```bash
# Run the MCP server
npx -y ultra-mcp

# Or, after building locally:
npm run build
node dist/cli.js
```

## CLI Commands

Ultra MCP provides a suite of command-line tools.

### `config` - Interactive Configuration

```bash
npx -y ultra-mcp config
```

Configure API keys interactively through a user-friendly menu.

### `dashboard` - Web Dashboard

```bash
npx -y ultra-mcp dashboard

# Custom port
npx -y ultra-mcp dashboard --port 4000

# Development mode
npx -y ultra-mcp dashboard --dev
```

Launch the web dashboard to view usage statistics, manage configurations, and monitor AI costs.

### `install` - Install for Claude Code

```bash
npx -y ultra-mcp install
```

Automatically install Ultra MCP as an MCP server for Claude Code.

### `doctor` - Health Check

```bash
npx -y ultra-mcp doctor

# Test connections to providers
npx -y ultra-mcp doctor --test
```

Check the installation health and test API connections to ensure everything is working correctly.

### `chat` - Interactive Chat

```bash
npx -y ultra-mcp chat

# Specify model and provider
npx -y ultra-mcp chat -m o3 -p openai
npx -y ultra-mcp chat -m grok-4 -p grok
```

Chat interactively with AI models directly from the command line. `-m` specifies the model, and `-p` specifies the provider.

### Database Commands

#### `db:show` - Show Database Info

```bash
npx -y ultra-mcp db:show
```

Display the database file location and basic statistics.

#### `db:stats` - Usage Statistics

```bash
npx -y ultra-mcp db:stats
```

Show detailed usage statistics for the last 30 days, including costs by provider.

#### `db:view` - Database Viewer

```bash
npx -y ultra-mcp db:view
```

Launch Drizzle Studio to explore the usage database interactively.

## Integration with Claude Code

### Automatic Installation (Recommended)

```bash
# Install Ultra MCP for Claude Code
npx -y ultra-mcp install
```

This command will:

- Detect your Claude Code installation.
- Add Ultra MCP as an MCP server.
- Configure it for user or project scope.
- Verify API key configuration.

### Manual Installation

Add the following to your Claude Code settings:

```json
{
  "mcpServers": {
    "ultra-mcp": {
      "command": "npx",
      "args": ["-y", "ultra-mcp@latest"]
    }
  }
}
```

## Integration with Cursor

First, configure your API keys:

```bash
npx -y ultra-mcp config
```

Then, add the following to your Cursor MCP settings:

```json
{
  "mcpServers": {
    "ultra-mcp": {
      "command": "npx",
      "args": ["-y", "ultra-mcp@latest"]
    }
  }
}
```

Ultra MCP will automatically use the API keys you configured with the `config` command.

## MCP Tools

Ultra MCP provides a set of AI tools accessible through Claude Code and Cursor, conforming to the MCP standard.

### 🧠 Deep Reasoning (`deep-reasoning`)

Leverage advanced AI models for complex problem-solving and analysis.

- **Default Models:** O3 for OpenAI/Azure, Gemini 2.5 Pro with Google Search, Grok-4 for xAI.
- **Use Cases:** Complex algorithms, architectural decisions, deep analysis.
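
Under the hood, an MCP client invokes this tool with the protocol's standard `tools/call` request. As a rough sketch (the argument names `prompt` and `provider` are assumptions for illustration, not the tool's documented schema), a call might look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "deep-reasoning",
    "arguments": {
      "prompt": "Evaluate two caching strategies for our API gateway and recommend one",
      "provider": "openai"
    }
  }
}
```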

### 🔍 Investigate (`investigate`)

Thoroughly investigate topics with configurable depth levels.

- **Depth Levels:** shallow, medium, deep.
- **Google Search:** Enabled by default.
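
A minimal sketch of calling this tool, assuming the topic and depth level are passed as `prompt` and `depth` arguments (both parameter names are assumptions, not the documented schema):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "investigate",
    "arguments": {
      "prompt": "Current best practices for rate limiting in distributed systems",
      "depth": "deep"
    }
  }
}
```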

## Repository

- **Repository:** [RealMikeChong/ultra-mcp](https://github.com/RealMikeChong/ultra-mcp)
- **Created:** June 28, 2025
- **Updated:** August 8, 2025
- **Language:** TypeScript
- **Category:** AI