This desktop application leverages the Model Context Protocol (MCP) to facilitate seamless interaction with various Large Language Models (LLMs). Built with Electron, it ensures cross-platform compatibility across Linux, macOS, and Windows.
The project aims to provide a clean, minimalistic codebase for understanding MCP principles and efficiently testing multiple servers and LLMs, making it ideal for developers and researchers.
The architecture, consistent with the MCP documentation, includes Renderer, APP, Client, and Server components, communicating via IPC and stdio. Key files are `main.ts`, `client.ts`, and `preload.ts`.
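For orientation, here is a minimal sketch of the Client-to-Server leg of that architecture, using the official `@modelcontextprotocol/sdk` TypeScript package. This is not the app's actual code, and the server command, arguments, and names are placeholders:

```typescript
// Minimal MCP client sketch: spawn a server as a child process and list
// its tools over stdio. The command/args are placeholders; the app reads
// its equivalents from config.json.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
  });

  const client = new Client({ name: "chat-mcp-sketch", version: "0.1.0" });
  await client.connect(transport); // performs the MCP initialize handshake

  // Discover the tools the server exposes; these are what a chat app
  // forwards to the LLM as function-calling definitions.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);
```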
To run the app, configure `config.json` in `src/main` with a valid `command` and the correct `path` in `args` (see the sketch below).
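For illustration only, a server entry might take the following shape (an assumption based on the common MCP server-definition format; the `filesystem` name, package, and directory are placeholders to adapt to the server you run):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}
```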
Ensure Node.js and npm are installed (check with `node -v` and `npm -v`), then install dependencies and start the app:

```bash
npm install
npm start
```
Create a `.json` file and paste one of the following configurations into it. The file can then be provided as the interface configuration for the Chat UI.
gtp-api.json
```json
{
  "chatbotStore": {
    "apiKey": "",
    "url": "https://api.aiql.com",
    "path": "/v1/chat/completions",
    "model": "gpt-4o-mini",
    "max_tokens_value": "",
    "mcp": true
  },
  "defaultChoiceStore": {
    "model": [
      "gpt-4o-mini",
      "gpt-4o",
      "gpt-4",
      "gpt-4-turbo"
    ]
  }
}
```
You can replace the `url` if you have direct access to the OpenAI API (e.g., `https://api.openai.com`). Alternatively, you can use another API endpoint that supports function calling:
qwen-api.json
```json
{
  "chatbotStore": {
    "apiKey": "",
    "url": "https://dashscope.aliyuncs.com/compatible-mode",
    "path": "/v1/chat/completions",
    "model": "qwen-turbo",
    "max_tokens_value": "",
    "mcp": true
  },
  "defaultChoiceStore": {
    "model": [
      "qwen-turbo",
      "qwen-plus",
      "qwen-max"
    ]
  }
}
```
deepinfra.json
```json
{
  "chatbotStore": {
    "apiKey": "",
    "url": "https://api.deepinfra.com",
    "path": "/v1/openai/chat/completions",
    "model": "meta-llama/Meta-Llama-3.1-70B-Instruct",
    "max_tokens_value": "32000",
    "mcp": true
  },
  "defaultChoiceStore": {
    "model": [
      "meta-llama/Meta-Llama-3.1-70B-Instruct",
      "meta-llama/Meta-Llama-3.1-405B-Instruct",
      "meta-llama/Meta-Llama-3.1-8B-Instruct"
    ]
  }
}
```
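All three endpoints above are OpenAI-compatible, so conceptually the app joins `url` and `path` and posts a standard chat-completions request. The following is a rough sketch of that mapping, not the app's actual request logic (which also handles streaming, tool calls, and MCP integration):

```typescript
// Rough sketch: how chatbotStore fields map onto an OpenAI-compatible
// chat-completions request. Field names mirror the configs above.
interface ChatbotStore {
  apiKey: string;
  url: string;
  path: string;
  model: string;
  max_tokens_value: string;
  mcp: boolean;
}

async function chat(store: ChatbotStore, userMessage: string): Promise<string> {
  const response = await fetch(store.url + store.path, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${store.apiKey}`,
    },
    body: JSON.stringify({
      model: store.model,
      messages: [{ role: "user", content: userMessage }],
      // max_tokens is only sent when the config provides a value.
      ...(store.max_tokens_value
        ? { max_tokens: Number(store.max_tokens_value) }
        : {}),
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}
```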
`npm run build-app` builds and packages the application for the current OS, storing artifacts in `/artifacts`. Debian/Ubuntu users may need to install `rpm` or skip the RPM build step.
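For example, on Debian/Ubuntu the missing `rpm` dependency can typically be installed with:

```bash
sudo apt-get install rpm
```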
The troubleshooting section addresses common issues such as `spawn npx ENOENT` (ISSUE 40) and installation/Electron-builder timeouts, providing workarounds and solutions.
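As one illustration (an assumption drawn from common MCP setups, not necessarily the resolution documented in ISSUE 40), `spawn npx ENOENT` on Windows is often worked around by launching `npx` through `cmd` in the server config; the server name, package, and path below are placeholders:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "cmd",
      "args": ["/c", "npx", "-y", "@modelcontextprotocol/server-filesystem", "C:\\allowed\\dir"]
    }
  }
}
```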
The demo section presents visual demonstrations of multimodal support, reasoning with LaTeX, MCP tool visualization, the tool-call process, prompt templates, dynamic LLM configuration, and DevTool troubleshooting.