MCP Directory

The Model Context Protocol (MCP) is an open standard for AI model communication.
Powered by Mert Koseoglu · Software Forge

tome

GitHub
Website

A desktop LLM client that leverages MCP for easy access to and use of local and remote Large Language Models.

# Tome - Magical AI Spellbook

<p align="center">
    <img src="static/images/repo-header.png" alt="Tome" />
</p>

<p align="center">
    <code>A magical desktop app that lets you chat with local or remote LLMs and schedule tasks, all superpowered by MCP.</code>
</p>

<p align="center">
    <a href="https://discord.gg/9CH6us29YA" target="_blank"><img src="https://img.shields.io/discord/1365100902561742868?logo=discord&logoColor=fff&label=Join%20Us!&color=9D7CD8" alt="Join Us on Discord" /></a>
    <a href="https://opensource.org/licenses/Apache-2.0" target="_blank"><img src="https://img.shields.io/badge/License-Apache_2.0-blue.svg" alt="License: Apache 2.0" /></a>
    <a href="https://github.com/runebookai/tome/releases" target="_blank"><img src="https://img.shields.io/github/v/release/runebookai/tome" alt="GitHub Release" /></a>
</p>

<p align="center">
    🔮 Download the Tome Desktop App: <a href="https://github.com/runebookai/tome/releases/download/0.9.1/Tome_0.9.1_x64-setup.exe">Windows</a> | <a href="https://github.com/runebookai/tome/releases/download/0.9.1/Tome_0.9.1_aarch64.dmg">macOS</a>
</p>

## Introduction

Tome is a desktop application designed to empower **anyone** to leverage the capabilities of Large Language Models (LLMs) through the **Model Context Protocol (MCP)**.  With Tome, you can connect to both local and remote LLMs and integrate them with a vast ecosystem of MCP servers, effectively creating your own personalized, AI-powered "spellbook."

**Key Concept: Model Context Protocol (MCP)**

MCP is a protocol that allows LLMs to access external tools and data sources. Think of it as a bridge connecting your LLM to the real world.  This enables your LLM to perform tasks such as:

*   Searching the web
*   Accessing your local file system
*   Interacting with APIs (e.g., Scryfall for Magic: The Gathering data, or Atlassian for project management)

Whether you prefer a fully private, local setup using Ollama and Qwen3 with local MCP servers, or want to harness the power of state-of-the-art cloud models with remote MCP servers, Tome provides the flexibility to tailor your experience.
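
Under the hood, MCP messages follow the JSON-RPC 2.0 format. As a rough sketch (the `fetch` tool name and its arguments here are illustrative, not taken from any specific server), a tool invocation looks something like this:

```python
import json

# Sketch of an MCP tool invocation as a JSON-RPC 2.0 request.
# The method name follows the MCP "tools/call" convention; the tool
# name and arguments below are illustrative.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "fetch",
        "arguments": {"url": "https://news.ycombinator.com"},
    },
}

wire_message = json.dumps(request)
print(wire_message)
```

The LLM never speaks HTTP or touches your file system directly; it asks the MCP client to deliver a message like this, and the server replies with the tool's result in the same JSON-RPC envelope.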

**Important Note:** This is a Technical Preview. Expect some rough edges!  We encourage you to [join our Discord community](https://discord.gg/9CH6us29YA) to share feedback, tips, and report any issues.  Star this repository to stay informed about updates and new feature releases!

## Features

*   **Beginner-Friendly Experience:**
    *   Simple download and installation process.
    *   No need to configure JSON, Docker, Python, or Node.js.
    *   Start chatting with MCP-powered models within minutes.

*   **Scheduled Tasks (NEW!):**
    *   Automate prompts to run hourly or at specific times daily.
    *   Compatible with any model and MCP server combination.

*   **AI Model Support:**
    *   **Remote:** Google Gemini, OpenAI, any OpenAI API-compatible endpoint.
    *   **Local:** Ollama, LM Studio, Cortex, any OpenAI API-compatible endpoint.

*   **Enhanced MCP Support:**
    *   User interface (UI) for installing, removing, and enabling/disabling MCP servers.
    *   Out-of-the-box support for npm, uvx, Node.js, and Python-based MCP servers.

*   **Smithery.ai Integration:**
    *   Access thousands of MCP servers through one-click installation via the [Smithery.ai](https://smithery.ai) registry.

*   **Customization:**
    *   Adjust context window size and temperature settings.

*   **Native Tool Calling and Reasoning Model Support:**
    *   UI enhancements that clearly distinguish between tool calls and reasoning messages.
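
The npm-, uvx-, Node.js-, and Python-based servers mentioned above are typically spawned as child processes that exchange newline-delimited JSON-RPC over stdin/stdout, which is what Tome manages for you. A minimal sketch of that pattern, using a tiny echo loop as a stand-in for a real server command such as `uvx mcp-server-fetch` so it runs anywhere:

```python
import json
import subprocess
import sys

# Stand-in for a real stdio MCP server command (e.g. ["uvx", "mcp-server-fetch"]):
# a one-line echo process that reads one line from stdin and writes it back.
server_cmd = [sys.executable, "-c",
              "import sys; sys.stdout.write(sys.stdin.readline())"]

proc = subprocess.Popen(server_cmd, stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE, text=True)

# Send one newline-delimited JSON-RPC request and read the reply line.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
proc.stdin.write(json.dumps(request) + "\n")
proc.stdin.flush()

response_line = proc.stdout.readline()
proc.stdin.close()
proc.wait()
print(response_line.strip())
```

A real client would keep this process alive for the whole session and match responses to requests by `id`; the point here is only the transport shape that "no Docker, Python, or Node.js configuration" is hiding from you.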

## Demo

[Demo Video](https://github.com/user-attachments/assets/0775d100-3eba-4219-9e2f-360a01f28cce)

## Getting Started

### Requirements

*   Operating System: macOS or Windows (Linux support coming soon!)
*   LLM Provider:
    *   [Ollama](https://ollama.com/) (easy to set up and use local models)
    *   [Gemini API key](https://aistudio.google.com/app/apikey) (free tier available)
*   [Download the latest release of Tome](https://github.com/runebookai/tome/releases)

### Quickstart Guide

1.  **Install Tome:** Download and install the application from the [releases page](https://github.com/runebookai/tome/releases).
2.  **Connect Your LLM:** Configure your preferred LLM provider. OpenAI, Ollama, and Gemini are pre-configured.  To add other providers like LM Studio, use `http://localhost:1234/v1` as the API URL.
3.  **Install an MCP Server:** Navigate to the MCP tab within Tome and install your first MCP server.  A simple starting point is the "Fetch" server.  Paste `uvx mcp-server-fetch` into the server field to install it.  You can find more MCP servers [here](https://github.com/modelcontextprotocol/servers).
4.  **Chat with Your MCP-Powered Model:** Start interacting with your model! For example, ask it to "fetch the top story on Hacker News."
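
"Any OpenAI API-compatible endpoint" in step 2 means the provider exposes the standard `/v1/chat/completions` request shape. As a sketch, assuming LM Studio is serving at `http://localhost:1234/v1` and the model name is whatever your local server has loaded (both assumptions, nothing is sent over the network here):

```python
import json

# Request shape an OpenAI-compatible endpoint expects. The base URL and
# model name are assumptions for a local LM Studio setup.
endpoint = "http://localhost:1234/v1/chat/completions"
payload = {
    "model": "qwen3",  # whichever model the local server has loaded
    "messages": [
        {"role": "user", "content": "Fetch the top story on Hacker News."}
    ],
}

body = json.dumps(payload)
print(endpoint)
print(body)
```

Tome builds and sends requests like this for you; the only thing you supply in the UI is the base URL (and an API key, for remote providers).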

## Vision

Our vision is to democratize access to local LLMs and the power of MCP. We are building a tool that empowers creativity with LLMs, regardless of your technical background.

### Core Principles

*   **Local First:** You retain control over your data.
*   **Accessibility:** No need to manage programming languages, package managers, or complex configuration files.

## What's Next?

We appreciate the valuable feedback we've received since releasing Tome. We have ambitious plans to expand its capabilities and break LLMs out of the traditional chatbox paradigm.

*   **Scheduled Tasks:** Enable LLMs to perform helpful tasks even when you are not actively using the application.
*   **Native Integrations:** Enhance MCP server integrations to facilitate more powerful and unique interactions with LLMs.
*   **App Builder:** Provide tools to create custom applications and workflows beyond the chat interface.
*   **Community Input:** We value your feedback! Join our community to share your ideas and suggestions.

## Community

*   [Discord](https://discord.gg/9CH6us29YA)
*   [Blog](https://blog.runebook.ai)
*   [Bluesky](https://bsky.app/profile/gettome.app)
*   [Twitter](https://twitter.com/get_tome)

Repository: runebookai/tome
Created: April 25, 2025
Updated: August 7, 2025
Language: Svelte
Category: AI