MCP Kubeflow Server

The MCP Kubeflow Server integrates the Kubeflow Pipelines (KFP) API and, optionally, the Kubeflow Training Operator. MCP clients can upload pipelines, create runs with parameters, watch run status, and fetch logs — useful for MLOps teams running on EKS, GKE, or AKS who want an LLM copilot for pipeline operations.

MCP facts

  • Kind: server
  • Ecosystem: anthropic-mcp
  • Language: Python (kfp SDK)
  • Transports: stdio

Capabilities

  • Tools: list_pipelines, create_run, get_run, list_experiments, fetch_logs
  • Resources: kubeflow://pipeline/{id}, kubeflow://run/{id}
  • Auth: OIDC ID token or Kubernetes service account token
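To illustrate how an MCP client drives these tools over the stdio transport, the sketch below builds the JSON-RPC `tools/call` message a client would send to start a run. The argument names (`pipeline_id`, `experiment_id`, `parameters`) are assumptions based on the tool list above, not a documented schema:

```python
import json

def build_tools_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 tools/call request, as sent over stdio."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical arguments for the create_run tool; the real argument
# names depend on the server's published tool schema.
msg = build_tools_call(1, "create_run", {
    "pipeline_id": "abc-123",
    "experiment_id": "default",
    "parameters": {"learning_rate": 0.01},
})
print(msg)
```

The server replies with a matching JSON-RPC response whose `result` carries the tool output (for `create_run`, typically the new run's ID and status).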

Install

pipx install mcp-server-kubeflow

Configuration

{
  "mcpServers": {
    "kubeflow": {
      "command": "mcp-server-kubeflow",
      "env": {
        "KFP_ENDPOINT": "https://kubeflow.example.com/pipeline",
        "KFP_TOKEN": "${KFP_TOKEN}"
      }
    }
  }
}
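The `${KFP_TOKEN}` placeholder is expanded from the client's environment before the server process is launched. A minimal sketch of that substitution, assuming a simple `${VAR}` convention (the `expand_env` helper is illustrative, not part of any MCP SDK):

```python
import json
import os
import re

def expand_env(value: str) -> str:
    """Replace ${VAR} placeholders with values from the environment."""
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), value)

config = json.loads("""
{
  "mcpServers": {
    "kubeflow": {
      "command": "mcp-server-kubeflow",
      "env": {
        "KFP_ENDPOINT": "https://kubeflow.example.com/pipeline",
        "KFP_TOKEN": "${KFP_TOKEN}"
      }
    }
  }
}
""")

os.environ["KFP_TOKEN"] = "demo-token"  # stand-in for a real OIDC token
env = {k: expand_env(v) for k, v in config["mcpServers"]["kubeflow"]["env"].items()}
print(env["KFP_TOKEN"])  # → demo-token
```

Keep the real token out of the config file itself; exporting `KFP_TOKEN` in the shell that starts the client is enough.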

Frequently asked questions

Is Kubeflow still current in 2026?

Yes — Kubeflow remains the leading Kubernetes-native ML orchestration project. Many teams use KFP v2 with Argo Workflows underneath.

Can it launch distributed training?

With the Training Operator integration enabled, the server can create TFJobs, PyTorchJobs, and MPIJobs. Because these jobs can consume significant compute, job creation is typically gated behind an explicit approval step.
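For context, a Training Operator job is just a Kubernetes custom resource. The sketch below builds a minimal `kubeflow.org/v1` PyTorchJob manifest of the kind the server would submit; the job name, image, command, and replica counts are placeholders:

```python
def pytorch_job_manifest(name: str, image: str, workers: int) -> dict:
    """Build a minimal kubeflow.org/v1 PyTorchJob custom resource."""
    def replica_spec(replicas: int) -> dict:
        return {
            "replicas": replicas,
            "template": {
                "spec": {
                    "containers": [{
                        "name": "pytorch",
                        "image": image,
                        "command": ["python", "train.py"],
                    }]
                }
            },
        }

    return {
        "apiVersion": "kubeflow.org/v1",
        "kind": "PyTorchJob",
        "metadata": {"name": name},
        "spec": {
            "pytorchReplicaSpecs": {
                "Master": replica_spec(1),
                "Worker": replica_spec(workers),
            }
        },
    }

# One master plus three workers for a distributed data-parallel run.
job = pytorch_job_manifest("mnist-ddp", "ghcr.io/example/train:latest", workers=3)
```

The Training Operator injects the rendezvous environment (`MASTER_ADDR`, `WORLD_SIZE`, and so on) into each replica, so the training script only needs standard `torch.distributed` initialization.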

How does it compare to Dagster/Prefect MCP?

Kubeflow is purpose-built for Kubernetes-based ML workflows and integrates with KServe. Dagster/Prefect are general orchestration tools.
