MCP Prefect Server
The MCP Prefect Server integrates Prefect Cloud and self-hosted Prefect servers. It lists flows and deployments, creates flow runs, reads run logs, and cancels failing work — letting an LLM act as an on-call copilot for Python-first data workflows.
MCP facts
- Kind: server
- Ecosystem: anthropic-mcp
- Language: Python
- Transports: stdio
Capabilities
- Tools: list_flows, list_deployments, create_flow_run, cancel_run, read_logs
- Resources: prefect://deployment/{id}, prefect://flow-run/{id}
- Auth: Prefect API key + workspace slug
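The resource templates above can be made concrete with a small sketch. The helper names below are illustrative, not part of the mcp-server-prefect API; only the URI templates and the PREFECT_API_URL shape come from this page.

```python
# Hypothetical helpers showing the URL and URI shapes this server expects.
# Function names are illustrative, not part of mcp-server-prefect.

def workspace_api_url(acct: str, ws: str) -> str:
    """Build the Prefect Cloud URL used as PREFECT_API_URL."""
    return f"https://api.prefect.cloud/api/accounts/{acct}/workspaces/{ws}"

def deployment_resource(deployment_id: str) -> str:
    """MCP resource URI for a deployment, per the template above."""
    return f"prefect://deployment/{deployment_id}"

def flow_run_resource(run_id: str) -> str:
    """MCP resource URI for a flow run."""
    return f"prefect://flow-run/{run_id}"

print(workspace_api_url("acct-123", "ws-456"))
# → https://api.prefect.cloud/api/accounts/acct-123/workspaces/ws-456
print(flow_run_resource("9f2c"))
# → prefect://flow-run/9f2c
```

A client would read these URIs from the server's resource listing rather than constructing them by hand; the sketch only pins down the shapes.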
Install
pipx install mcp-server-prefect

Configuration
{
  "mcpServers": {
    "prefect": {
      "command": "mcp-server-prefect",
      "env": {
        "PREFECT_API_URL": "https://api.prefect.cloud/api/accounts/{acct}/workspaces/{ws}",
        "PREFECT_API_KEY": "${PREFECT_API_KEY}"
      }
    }
  }
}

Frequently asked questions
How is it different from the Dagster MCP server?
Prefect is flow-oriented (imperative Python); Dagster is asset-oriented (declarative lineage). Both expose similar runtime surfaces via MCP, but asset reasoning is easier in Dagster.
Can the LLM author flows?
Flow authoring still happens in Python. MCP is for runtime control — triggering, inspecting, cancelling.
Does it support Prefect work pools?
Yes — tools can read work-pool status and late runs, useful for diagnosing idle workers.
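The late-run check mentioned above can be sketched with plain datetime arithmetic. The record shape and state names here are assumptions for illustration, not Prefect's actual flow-run schema.

```python
from datetime import datetime, timedelta, timezone

# Sketch of late-run detection over flow-run records. The dict shape and
# the "SCHEDULED" state label are assumptions, not Prefect's API schema.
def find_late_runs(runs: list[dict], now: datetime,
                   grace: timedelta = timedelta(minutes=5)) -> list[dict]:
    """Return scheduled runs whose expected start is past the grace period."""
    return [
        r for r in runs
        if r["state"] == "SCHEDULED"
        and now - r["expected_start"] > grace
    ]

now = datetime(2026, 4, 20, 12, 0, tzinfo=timezone.utc)
runs = [
    {"id": "a", "state": "SCHEDULED", "expected_start": now - timedelta(minutes=20)},
    {"id": "b", "state": "SCHEDULED", "expected_start": now - timedelta(minutes=1)},
    {"id": "c", "state": "RUNNING",   "expected_start": now - timedelta(hours=1)},
]
print([r["id"] for r in find_late_runs(runs, now)])  # → ['a']
```

A run that is scheduled but far past its expected start usually points at an idle or missing worker in the pool, which is the diagnosis the FAQ answer describes.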
Sources
- Prefect REST API — accessed 2026-04-20
- Model Context Protocol — accessed 2026-04-20
- MCP servers repo — accessed 2026-04-20