MCP Integration: LibreChat
LibreChat is a popular open-source chat app that proxies many LLM providers behind one UI. Since 2025 it supports MCP servers as first-class tool providers, so you can configure a Postgres MCP and a Slack MCP and use them with OpenAI, Anthropic, and Ollama in the same LibreChat install.
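A two-server setup like the one just described might be sketched as follows in librechat.yaml. The package names and connection arguments here are illustrative assumptions (the reference servers publish under the @modelcontextprotocol npm scope); check each server's README for its actual invocation:

```yaml
# librechat.yaml — hypothetical Postgres + Slack setup
mcpServers:
  postgres:
    command: npx
    args: ["-y", "@modelcontextprotocol/server-postgres", "postgresql://localhost/mydb"]
  slack:
    command: npx
    args: ["-y", "@modelcontextprotocol/server-slack"]
    env:
      SLACK_BOT_TOKEN: "${SLACK_BOT_TOKEN}"
```

Once configured, both servers surface as tool providers for any endpoint, regardless of whether the model behind it is OpenAI, Anthropic, or a local Ollama model.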
MCP facts
- Kind: integration
- Ecosystem: anthropic-mcp
- Language: TypeScript / Node.js
- Transports: stdio, http
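The stdio transport spawns the MCP server as a local child process (the pattern used in the configuration examples on this page), while http connects LibreChat to an already-running remote server. A remote entry might look like the sketch below; the key names (`type`, `url`) are assumptions drawn from the LibreChat MCP docs and should be verified against your installed version:

```yaml
# librechat.yaml — connecting to a remote MCP server over HTTP
# key names assumed; confirm against the LibreChat docs for your release
mcpServers:
  weather:
    type: streamable-http
    url: "https://example.com/mcp"
```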
Capabilities
- Consumes MCP servers across OpenAI, Anthropic, Google, and local models
- Configure servers declaratively in librechat.yaml
- Per-user and per-endpoint tool selection
Install

```shell
docker compose up -d   # from the LibreChat repo
```

Configuration
```yaml
# librechat.yaml
mcpServers:
  filesystem:
    command: npx
    args: ["-y", "@modelcontextprotocol/server-filesystem", "/workspace"]
  github:
    command: npx
    args: ["-y", "@modelcontextprotocol/server-github"]
    env:
      GITHUB_PERSONAL_ACCESS_TOKEN: "${GITHUB_PERSONAL_ACCESS_TOKEN}"
```

Frequently asked questions
How does LibreChat's MCP differ from Claude Desktop's?
Same protocol, different client surface. LibreChat runs server-side and is multi-user/multi-provider, while Claude Desktop is a single-user desktop client.
Does it work with all providers?
Tool calling works with any provider whose model supports function calling — OpenAI, Anthropic, Gemini, Mistral, and many open-source models through Ollama.
Safety concerns?
Server-side MCP means tokens live on the server, shared across users unless scoped. Use per-user credentials where possible and follow the LibreChat docs on MCP isolation.
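As a concrete illustration of per-user scoping: the LibreChat docs describe user-level variables that are resolved per authenticated user rather than read once from the server environment. The placeholder syntax below is an assumption based on those docs and must be verified before use:

```yaml
# librechat.yaml — hypothetical per-user credential scoping
# placeholder syntax assumed from the LibreChat MCP isolation docs; verify before use
mcpServers:
  github:
    command: npx
    args: ["-y", "@modelcontextprotocol/server-github"]
    env:
      GITHUB_PERSONAL_ACCESS_TOKEN: "{{GITHUB_TOKEN}}"  # resolved per user, not shared
```

The design goal either way is the same: no single long-lived token shared across every chat user.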
Sources
- LibreChat Documentation — accessed 2026-04-20
- Model Context Protocol — accessed 2026-04-20