The MCP Code Executor (bazinga012/mcp_code_executor) is an MCP server that enables Large Language Models (LLMs) to execute Python code within a designated Conda environment, giving them access to the specific libraries and dependencies defined in that environment.
<a href="https://glama.ai/mcp/servers/45ix8xode3"><img width="380" height="200" src="https://glama.ai/mcp/servers/45ix8xode3/badge" alt="Code Executor MCP server" /></a>
Configure the server by adding a configuration block to your MCP servers configuration file. The block specifies the command used to launch the server (`node`), arguments including the path to the built `index.js` file, and environment variables for the code storage directory (`CODE_STORAGE_DIR`) and the Conda environment name (`CONDA_ENV_NAME`).
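For example, a minimal configuration sketch might look like the following; the file paths, the environment name, and the location of the built `index.js` are placeholders you will need to adjust for your own setup:

```json
{
  "mcpServers": {
    "mcp-code-executor": {
      "command": "node",
      "args": [
        "/path/to/mcp_code_executor/build/index.js"
      ],
      "env": {
        "CODE_STORAGE_DIR": "/path/to/code_storage",
        "CONDA_ENV_NAME": "your-conda-env"
      }
    }
  }
}
```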
Once configured, LLMs can execute Python code by generating a file in the specified `CODE_STORAGE_DIR`, which is then run within the designated Conda environment. LLMs reference this MCP server in their prompts to trigger code generation and execution.
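Conceptually, the execution step amounts to writing the generated code to disk and invoking it with the Conda environment's interpreter. The sketch below is not the server's actual implementation, only an illustration of that flow in Node.js, assuming `conda` is available on the PATH and the two environment variables from the configuration are set:

```javascript
// Hypothetical sketch of the execution step (not the actual server code):
// persist the LLM-generated Python to CODE_STORAGE_DIR, then run it inside
// the named Conda environment via `conda run`.
import { writeFileSync } from "fs";
import { execFile } from "child_process";
import { join } from "path";
import { randomUUID } from "crypto";

const storageDir = process.env.CODE_STORAGE_DIR; // where generated files are stored
const condaEnv = process.env.CONDA_ENV_NAME;     // Conda environment to run in

function executePythonCode(code, callback) {
  // Save the generated code as a uniquely named script.
  const scriptPath = join(storageDir, `snippet_${randomUUID()}.py`);
  writeFileSync(scriptPath, code);

  // `conda run -n <env> python <script>` executes the file with the
  // environment's interpreter and its installed dependencies.
  execFile("conda", ["run", "-n", condaEnv, "python", scriptPath],
    (error, stdout, stderr) => callback(error, stdout, stderr));
}

// Example: report the interpreter version from inside the Conda environment.
executePythonCode("import sys; print(sys.version)", (err, out, errOut) => {
  if (err) console.error(errOut || err);
  else console.log(out);
});
```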
Contributions are welcome through issues and pull requests.
This project is licensed under the MIT License.