
memtomem

GitHub
Website

Markdown-first, long-term memory infrastructure for AI agents. Hybrid BM25 + semantic search across markdown/code files via MCP.


memtomem

PyPI · Python 3.12+ · License: Apache 2.0 · CLA

Give your AI agent a long-term memory.

memtomem turns your markdown notes, documents, and code into a searchable knowledge base that any AI coding agent can use. Write notes as plain .md files — memtomem indexes them and makes them searchable by both keywords and meaning.

```mermaid
flowchart LR
    A["Your files\n.md .json .py"] -->|Index| B["memtomem"]
    B -->|Search| C["AI agent\n(Claude Code, Cursor, etc.)"]
```

First time here? Follow the Getting Started guide — you'll have a working setup in under 5 minutes.


Why memtomem?

| Problem | How memtomem solves it |
| --- | --- |
| AI forgets everything between sessions | Index your notes once, search them in every session |
| Keyword search misses related content | Hybrid search: exact keywords + meaning-based similarity |
| Notes scattered across tools | One searchable index for markdown, JSON, YAML, Python, JS/TS |
| Vendor lock-in | Your .md files are the source of truth; the DB is a rebuildable cache |
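The hybrid search mentioned above combines keyword and semantic rankings with Reciprocal Rank Fusion (RRF). As a minimal sketch of the general RRF technique (illustrative names only, not memtomem's internals), each document earns `1/(k + rank)` from every ranking it appears in, and the sums are merged:

```python
# Sketch of Reciprocal Rank Fusion (RRF): merge several ranked lists
# into one ordering. Names here are illustrative, not memtomem's API.

def rrf_fuse(rankings, k=60):
    """rankings: list of ranked doc-id lists (best first).
    k: smoothing constant; 60 is the value from the original RRF paper."""
    scores = {}
    for ranked in rankings:
        for rank, doc_id in enumerate(ranked, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest combined score first.
    return sorted(scores, key=scores.get, reverse=True)

bm25 = ["deploy.md", "ops.md", "notes.md"]    # keyword ranking
dense = ["deploy.md", "intro.md", "ops.md"]   # semantic ranking
print(rrf_fuse([bm25, dense])[0])  # → deploy.md
```

A document ranked well by both retrievers beats one ranked well by only one, which is why RRF needs no score normalization between BM25 and cosine similarity.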

Quick Start

1. Install

```bash
ollama pull nomic-embed-text          # local embeddings (~270MB, free)
uv tool install memtomem              # or: pipx install memtomem
```

No GPU? Pick OpenAI in the wizard — see Embeddings.

2. Setup

```bash
mm init                               # 8-step wizard (or: mm init -y for CI)
```

The wizard picks your embedding model, points at the folder you want indexed, and registers memtomem with your AI editor.
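For illustration only, registering an MCP server with a client such as Claude Code usually comes down to a JSON entry of roughly this shape (the server name and command here are assumptions modeled on the uvx invocation in this README, not the wizard's literal output):

```json
{
  "mcpServers": {
    "memtomem": {
      "command": "uvx",
      "args": ["--from", "memtomem", "memtomem-server"]
    }
  }
}
```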

3. Use

```text
"Call the mem_status tool"   →  confirms the server is connected
"Index my notes folder"      →  mem_index(path="~/notes")
"Search for deployment"      →  mem_search(query="deployment checklist")
"Remember this insight"      →  mem_add(content="...", tags=["ops"])
```
<details> <summary><b>Other install options</b></summary>

Project-scoped (per-project isolation):

```bash
uv add memtomem && uv run mm init    # all commands need the `uv run` prefix
```

No install (uvx on demand):

```bash
claude mcp add memtomem -s user -- uvx --from memtomem memtomem-server
```

See MCP Client Setup for Cursor / Windsurf / Claude Desktop / Gemini CLI.

</details>

Key Features

  • Hybrid search — BM25 keyword + dense vector + RRF fusion in one query
  • Semantic chunking — heading-aware Markdown, AST-based Python, tree-sitter JS/TS, structure-aware JSON/YAML/TOML
  • Incremental indexing — chunk-level SHA-256 diff; only changed chunks get re-embedded
  • Namespaces — organize memories into scoped groups with auto-derivation from folder names
  • Maintenance — near-duplicate detection, time-based decay, TTL expiration, auto-tagging
  • Web UI — visual dashboard for search, sources, tags, sessions, health monitoring
  • MCP tools — in core mode, a single mem_do meta-tool routes all non-core actions, keeping context usage minimal
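The incremental-indexing bullet above can be sketched in a few lines (hypothetical helper names, not memtomem's API): hash each chunk with SHA-256, then re-embed only chunks whose hash was absent from the previous index.

```python
# Sketch of chunk-level incremental indexing: only content whose
# SHA-256 hash changed needs re-embedding. Illustrative names only.
import hashlib

def chunk_hashes(chunks):
    """Map each chunk's SHA-256 hex digest to the chunk text."""
    return {hashlib.sha256(c.encode()).hexdigest(): c for c in chunks}

def changed_chunks(old_chunks, new_chunks):
    """Return chunks from new_chunks whose content hash is new."""
    old = chunk_hashes(old_chunks)
    return [c for h, c in chunk_hashes(new_chunks).items() if h not in old]

old = ["# Setup", "Run `mm init`.", "# Usage"]
new = ["# Setup", "Run `mm init -y`.", "# Usage"]
print(changed_chunks(old, new))  # only the edited chunk re-embeds
```

Because unchanged chunks hash identically, editing one paragraph of a large note triggers exactly one new embedding call rather than a full re-index.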

Ecosystem

| Package | Description |
| --- | --- |
| memtomem | Core — MCP server, CLI, Web UI, hybrid search, storage |
| memtomem-stm | STM proxy — proactive memory surfacing via tool interception |

Documentation

| Guide | Description |
| --- | --- |
| Getting Started | Install, setup wizard, first use |
| Hands-On Tutorial | Follow-along with example files |
| Interactive Notebooks | Jupyter notebooks for the Python API — hello, indexing, sessions, tuning, LangGraph |
| User Guide | Complete feature walkthrough |
| Configuration | All MEMTOMEM_* environment variables |
| Embeddings | ONNX, Ollama, and OpenAI embedding providers |
| LLM Providers | Ollama, OpenAI, Anthropic, and compatible endpoints |
| MCP Client Setup | Editor-specific configuration |
| Agent Memory Guide | Sessions, working memory, procedures |
| Web UI | Visual dashboard |
| Hooks | Claude Code hooks for auto-indexing |

Contributing

See CONTRIBUTING.md for setup instructions and the contributor guide.

License

Apache License 2.0. Contributions are accepted under the terms of the Contributor License Agreement.

Repository

memtomem/memtomem

Created: March 28, 2026
Updated: April 13, 2026
Language: Python
Category: AI