Kibela MCP Server

This repository provides an MCP (Model Context Protocol) server implementation designed to integrate LLMs (Large Language Models) with the Kibela API. This integration allows LLMs to interact with and retrieve content from Kibela, a knowledge-sharing platform.

<img width="320" alt="Example" src="https://github.com/user-attachments/assets/eeed8f45-eb24-456d-bb70-9e738aa1bfb3" />

<a href="https://glama.ai/mcp/servers/m21nkeig1p"><img width="380" height="200" src="https://glama.ai/mcp/servers/m21nkeig1p/badge" alt="Kibela Server MCP server" /></a>

Features

  • Search notes using a query.
  • Retrieve your latest notes.
  • Obtain note content and associated comments.

Configuration

The server requires two environment variables: KIBELA_TEAM (your Kibela team name) and KIBELA_TOKEN (a Kibela access token). Configuration examples are provided for Claude Desktop and Cursor, showing how to register the server by specifying the command, arguments, and environment variables in each client's configuration file. For Cursor integration over SSE, set the server URL to http://localhost:3000/sse.
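As a sketch of what such a client configuration might look like, here is a minimal Claude Desktop entry. The package name and placeholder values are assumptions; substitute the actual command and credentials for your setup:

```json
{
  "mcpServers": {
    "kibela": {
      "command": "npx",
      "args": ["-y", "@kiwamizamurai/mcp-kibela-server"],
      "env": {
        "KIBELA_TEAM": "your-team",
        "KIBELA_TOKEN": "your-token"
      }
    }
  }
}
```

The env block is how the two required variables reach the server process when the client launches it.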

Tools

The server exposes the following tools:

  • kibela_search_notes — search Kibela notes using a query.
  • kibela_get_my_notes — retrieve your latest notes.
  • kibela_get_note_content — get a note's content and its comments.

Each tool's input parameters and return values are documented.
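Under the Model Context Protocol, a client invokes one of these tools with a standard JSON-RPC `tools/call` request. A sketch of such a request for kibela_search_notes follows; the `query` argument name is an assumption based on the tool's description:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "kibela_search_notes",
    "arguments": { "query": "onboarding" }
  }
}
```

MCP clients such as Claude Desktop and Cursor construct these requests automatically; the shape is shown here only to illustrate how the tool names above map onto the wire protocol.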

Repository

kiwamizamurai/mcp-kibela-server

Created: February 1, 2025
Updated: February 11, 2025
Language: TypeScript
Category: Search & Knowledge