
MCP Client: Open WebUI

Open WebUI is a widely deployed self-hosted web interface, originally built for Ollama and now supporting OpenAI-compatible APIs and MCP servers. As an MCP client it exposes registered tools to chats with whichever underlying model you run locally — Llama, Qwen, DeepSeek, or any OpenAI-compatible endpoint.

MCP facts

Kind: client
Ecosystem: anthropic-mcp
Language: Python / Svelte
Transports: stdio, http

Capabilities

  • Consumes any MCP server over stdio or HTTP
  • Per-model tool selection — enable different tool sets per chat
  • Runs on Docker / Kubernetes for self-hosted deployments

Install

docker run -p 3000:8080 ghcr.io/open-webui/open-webui:main
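
The quick-start command above keeps all state inside the container, so it is lost when the container is removed. A sketch of a more durable invocation, assuming a named Docker volume (the volume and container names here are arbitrary choices, not required by Open WebUI):

```shell
# Run detached, persist application data in a named volume
docker run -d -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```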

Configuration

Register servers through the UI (Admin Settings → MCP), or mount a config file:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/data"]
    }
  }
}
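
Since the transports list includes HTTP, a remote MCP server can in principle be registered by address instead of by command. A hedged sketch by analogy with common MCP client configs — the `url` key, the server name `remote-tools`, and the endpoint shown are illustrative assumptions, not confirmed Open WebUI schema:

```json
{
  "mcpServers": {
    "remote-tools": {
      "url": "http://localhost:9000/mcp"
    }
  }
}
```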

Frequently asked questions

Does Open WebUI support MCP with Ollama models?

Yes — Open WebUI orchestrates the tool calls, so any function-calling-capable model (including many modern Llama and Qwen variants) can invoke MCP tools.

Can I share server configs across users?

Yes. Admin-scoped MCP server definitions apply organization-wide, while user-scoped definitions apply only to the authenticated user — useful for multi-tenant self-hosting.

Any Docker Compose tips?

Mount your MCP server binaries and config into the Open WebUI container so they can be executed as child processes, or expose an HTTP MCP server separately.
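
The mounting advice above can be sketched as a Compose file. This is an illustrative assumption of a layout, not an official template: the host paths, the container mount points for the MCP config and server binaries, and the volume name are all hypothetical choices.

```yaml
# Hypothetical docker-compose.yml sketch
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    volumes:
      # Persist Open WebUI application data
      - open-webui-data:/app/backend/data
      # Mount MCP config and server binaries so stdio servers
      # can be spawned as child processes inside the container
      - ./mcp-config.json:/app/backend/data/mcp-config.json:ro
      - ./mcp-servers:/opt/mcp-servers:ro
volumes:
  open-webui-data:
```

An HTTP MCP server would instead run as its own Compose service and be registered by URL, keeping the Open WebUI container free of extra binaries.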

Sources

  1. Open WebUI Documentation — accessed 2026-04-20
  2. Model Context Protocol — accessed 2026-04-20