A modular MCP server suite for performance-testing workflows: generate JMeter scripts from browser automation, manage and run load tests, correlate APM metrics with test results through temporal analysis, and produce customizable performance reports—all built for seamless integration and scalability across your testing ecosystem.
⚠️ This project is under active development. Features, modules, and documentation may change frequently. Use at your own risk and please report any issues or suggestions!
Welcome to the MCP Perf Suite — a modular collection of MCP servers designed to support and streamline performance testing workflows.
This repository hosts multiple MCP servers, each dedicated to a specific role in the performance testing lifecycle:
The MCP servers in this repository (and external integrations like Playwright MCP) form a complete performance testing pipeline. This workflow illustrates how scripts are created, validated, executed, monitored, analyzed, and finally reported and shared across teams.
```
┌────────────────────────┐
│    Playwright MCP      │
│  (external, captures   │
│   browser traffic)     │
└───────────┬────────────┘
            │ JSON traffic
            ▼
┌────────────────────────┐       ┌─────────────────────────┐
│  JMeter MCP Server     │◄─────►│  PerfMemory MCP Server  │
│  - Generate JMX scripts│       │  - Recall past fixes    │
│  - Run smoke tests to  │       │  - Store new lessons    │
│    validate correctness│       │  - Vector similarity    │
└───────────┬────────────┘       │    search (pgvector)    │
            │ Validated JMX      │  - Knowledge graph      │
            ▼                    │    (Apache AGE)         │
┌────────────────────────┐       └─────────────────────────┘
│  BlazeMeter MCP Server │
│  - Execute full-scale  │
│    performance tests   │
│  - Fetch run results   │
└───────────┬────────────┘
            │ Results & metrics
            ▼
┌────────────────────────────────┐
│       Datadog MCP Server       │
│   (APM metrics correlation)    │
└───────────┬────────────────────┘
            │
            ▼
┌────────────────────────────────┐
│ Performance Test Analysis MCP  │
│ - Analyze BlazeMeter results   │
│ - Analyze Datadog metrics      │
│ - Log analysis (JMeter +       │
│   Datadog logs)                │
│ - Time-series correlation      │
└───────────┬────────────────────┘
            │
            ▼
┌────────────────────────────────┐
│   Performance Reporting MCP    │
│ (PDF, Word, Markdown reports)  │
└───────────┬────────────────────┘
            │
            ▼
┌────────────────────────────────┐
│     Confluence MCP Server      │
│ (Publish reports to Confluence)│
└────────────────────────────────┘
```

Each MCP server lives in its own subdirectory within this repo, making it easy to develop, maintain, and deploy independently:
```
mcp-perf-suite/
├── artifacts/          # Folder that contains the performance test results
├── blazemeter-mcp/     # BlazeMeter MCP server (current)
├── confluence-mcp/     # Confluence MCP server (current)
├── datadog-mcp/        # Datadog MCP server (current)
├── docker/             # Dockerfiles and Compose files (pgvector + Apache AGE)
├── jmeter-mcp/         # JMeter MCP server (current)
├── perfanalysis-mcp/   # LLM-powered test analysis MCP (current)
├── perfmemory-mcp/     # AI memory & lessons learned MCP (current)
├── perfreport-mcp/     # Reporting and formatting MCP (current)
├── README.md           # This file: repo overview and guidance
└── LICENSE             # Repository license (e.g., MIT)
```
All MCP servers use FastMCP and Python 3.12+. Each server has its own README with detailed setup instructions, configuration, and tool reference. Navigate to the server's folder and follow its README to get started.
| MCP Server | Folder | README | Prerequisites |
|---|---|---|---|
| JMeter | jmeter-mcp/ | README | JMeter 5.6+, Playwright MCP (optional) |
| BlazeMeter | blazemeter-mcp/ | README | BlazeMeter API key |
| Datadog | datadog-mcp/ | README | Datadog API + App keys |
| Performance Analysis | perfanalysis-mcp/ | README | BlazeMeter or JMeter test artifacts |
| PerfMemory | perfmemory-mcp/ | README | PostgreSQL + pgvector + Apache AGE (setup guide), embedding API key |
| Performance Report | perfreport-mcp/ | README | Analysis artifacts |
| Confluence | confluence-mcp/ | README | Confluence token (cloud or on-prem) |
Common setup steps:
- Copy `.env.example` to `.env` and fill in your credentials
- Copy `config.example.yaml` to `config.yaml` and adjust settings as needed
- Install the server with `pip install -e .` (or use `pyproject.toml`)
- Register the server in your MCP client's `mcp.json`

For Docker-based dependencies (e.g., PerfMemory's PostgreSQL with pgvector + Apache AGE), see `docker/docker-compose-windows.yaml` or `docker/docker-compose-mac.yaml`.
The MCP Perf Suite is evolving toward a schema-driven architecture that enables true modularity and extensibility. The core principle: standardized data contracts between MCPs ensure that adding new data sources doesn't require changes to downstream consumers.
```
┌─────────────────────────────────────────────────────────────────────────────┐
│                                DATA SOURCES                                 │
├─────────────────────────────────┬───────────────────────────────────────────┤
│             APM MCP             │               Load Test MCP               │
│     (replaces Datadog MCP)      │         (replaces BlazeMeter MCP)         │
│                                 │                                           │
│   ┌─────────────────────────┐   │        ┌─────────────────────────┐        │
│   │  Datadog Adapter        │   │        │  BlazeMeter Adapter     │        │
│   │  New Relic Adapter      │   │        │  LoadRunner Adapter     │        │
│   │  AppDynamics Adapter    │   │        │  Gatling Adapter        │        │
│   │  Dynatrace Adapter      │   │        │  k6 Adapter             │        │
│   │  Splunk APM Adapter     │   │        │  Locust Adapter         │        │
│   └──────────┬──────────────┘   │        └──────────┬──────────────┘        │
│              │                  │                   │                       │
│              ▼                  │                   ▼                       │
│   ┌─────────────────────────┐   │        ┌─────────────────────────┐        │
│   │   Standardized APM      │   │        │  Standardized Load Test │        │
│   │   Output Schema         │   │        │   Output Schema         │        │
│   │ (metrics, logs, traces) │   │        │  (results, aggregates)  │        │
│   └──────────┬──────────────┘   │        └──────────┬──────────────┘        │
├──────────────┴──────────────────┴───────────────────┴───────────────────────┤
│                                                                             │
│                          STANDARDIZED SCHEMA LAYER                          │
│            (Source-agnostic data contracts / JSON & CSV schemas)            │
│                                                                             │
├─────────────────────────────────────────────────────────────────────────────┤
│                                                                             │
│                         ┌─────────────────────────┐                         │
│                         │  Performance Analysis   │                         │
│                         │          MCP            │                         │
│                         │    (source-agnostic)    │                         │
│                         └───────────┬─────────────┘                         │
│                                     │                                       │
│                                     ▼                                       │
│                         ┌─────────────────────────┐                         │
│                         │   Performance Report    │                         │
│                         │          MCP            │                         │
│                         │    (source-agnostic)    │                         │
│                         └───────────┬─────────────┘                         │
│                                     │                                       │
│               ┌─────────────────────┼─────────────────────┐                 │
│               ▼                     ▼                     ▼                 │
│         ┌──────────┐          ┌──────────┐        ┌──────────────┐          │
│         │Confluence│          │ MS Graph │        │ Other Output │          │
│         │   MCP    │          │   MCP    │        │   Adapters   │          │
│         └──────────┘          └──────────┘        └──────────────┘          │
│                                                                             │
└─────────────────────────────────────────────────────────────────────────────┘
```

| Benefit | Description |
|---|---|
| Extensibility | Add new APM tools or load test platforms by implementing an adapter that outputs the standard schema |
| Loose Coupling | PerfAnalysis and PerfReport MCPs remain unchanged when new data sources are added |
| Community Contributions | Clear schema contracts make it easy for contributors to add support for their preferred tools |
| Maintainability | Changes to source APIs (e.g., Datadog v3) only affect their respective adapter, not the entire pipeline |
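The extensibility described above can be made concrete with a small sketch. Assuming, hypothetically, that the standardized APM output schema is a flat list of metric points, a new data source plugs in by implementing a single adapter; downstream consumers depend only on the schema, never on a vendor API. All class and field names here are illustrative assumptions, not the suite's actual data contracts:

```python
# Illustrative adapter pattern for the standardized schema layer.
# Schema fields and names are assumptions for this sketch only.
from dataclasses import dataclass, asdict
from typing import Protocol

@dataclass
class MetricPoint:
    """One sample in a source-agnostic APM output schema."""
    metric: str     # e.g. "cpu.utilization"
    timestamp: int  # epoch seconds
    value: float
    source: str     # which APM tool produced it

class APMAdapter(Protocol):
    """Contract every APM adapter fulfils: emit the standard schema."""
    def fetch_metrics(self, query: str, start: int, end: int) -> list[MetricPoint]: ...

class FakeDatadogAdapter:
    """Stand-in for a real Datadog adapter: maps a raw payload into the
    standard schema so consumers never see Datadog's native format."""
    def __init__(self, raw: list[tuple[int, float]]):
        self.raw = raw  # (timestamp, value) pairs, as if from the vendor API

    def fetch_metrics(self, query: str, start: int, end: int) -> list[MetricPoint]:
        return [MetricPoint(query, ts, val, "datadog")
                for ts, val in self.raw if start <= ts <= end]

adapter: APMAdapter = FakeDatadogAdapter([(100, 41.0), (160, 57.5), (900, 80.0)])
points = adapter.fetch_metrics("cpu.utilization", 0, 300)  # 2 in-window samples
rows = [asdict(p) for p in points]  # schema-shaped dicts, ready for JSON/CSV
```

Swapping in, say, a New Relic adapter would mean writing one new class with the same `fetch_metrics` shape; the analysis and reporting MCPs stay untouched.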
Contributions, ideas, and feature requests are welcome! Please open issues or create pull requests to collaborate.
This project is licensed under the MIT License. See the LICENSE file for details.
Created with ❤️ to enable next-gen performance testing, analysis, and reporting powered by FastMCP and AI.