This repository implements an MCP server for ZenML, providing access to live information about ZenML resources such as pipelines, artifacts, and users, and allowing new pipeline runs to be triggered.
The Model Context Protocol (MCP) standardizes how applications provide context to Large Language Models (LLMs); this project uses it to expose the ZenML API to MCP-compatible clients.
The server offers MCP tools covering the core read functionality of the ZenML server, providing live information about users, stacks, pipelines, runs, steps, services, stack components, flavors, run templates, schedules, artifacts (metadata only), service connectors, step code, and logs (for steps run on cloud infrastructure). It also allows triggering new pipeline runs via run templates.
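For example, an MCP-compatible client can launch this server over stdio and enumerate the tools it exposes. The sketch below uses the official MCP Python SDK (the mcp package); the launch command, script path, and environment variable names are illustrative assumptions rather than confirmed details of this repository.

```python
# Minimal sketch: connect to the ZenML MCP server over stdio and list its tools.
# Requires the official MCP Python SDK ("pip install mcp"). Paths, the launch
# command, and the environment variable names below are placeholder assumptions.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uv",                              # path to the uv executable
    args=["run", "path/to/zenml_server.py"],   # hypothetical path to the server script
    env={
        "ZENML_STORE_URL": "https://your-zenml-server.example.com",  # assumed variable name
        "ZENML_STORE_API_KEY": "your-api-key",                       # assumed variable name
    },
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the tools the server exposes (pipelines, runs, stacks, ...).
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name)

asyncio.run(main())
```

Each tool name printed corresponds to one of the read or run-triggering operations described above.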
To use it, you need a ZenML Cloud server and uv. The MCP config file is a JSON document specifying the connection details: the path to uv, the path to the zenml_server.py file, the ZenML server URL, and the API key. Instructions are provided for integrating with Claude Desktop and Cursor, detailing configuration steps and usage; a sketch of such a config file follows below.
zenml-io/mcp-zenml