MCP Hugging Face Hub Server
The MCP Hugging Face Hub Server integrates the huggingface_hub library: search for models and datasets, read model cards and dataset cards, list Spaces, and call Inference Endpoints or the free Inference API. It lets an MCP client act as a copilot for ML experimentation against the Hugging Face ecosystem.
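As a rough sketch of how a client drives such a server, an MCP `tools/call` request for the search_models tool might look like the following. The argument names (`query`, `limit`) are illustrative assumptions, not confirmed parameter names from this server.

```python
import json

# Hypothetical MCP tools/call request for the server's search_models tool.
# Argument names ("query", "limit") are assumptions for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_models",
        "arguments": {"query": "text-classification", "limit": 5},
    },
}

# Over the stdio transport, each MCP message is serialized as JSON.
wire = json.dumps(request)
print(wire)
```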
MCP facts
- Kind: server
- Ecosystem: anthropic-mcp
- Language: Python (huggingface_hub)
- Transports: stdio
Capabilities
- Tools: search_models, list_datasets, read_model_card, run_inference, list_spaces
- Resources: hf://model/{repo}, hf://dataset/{repo}, hf://space/{repo}
- Auth: HF_TOKEN (user or org access token)
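The `hf://` resource URIs above can be split into a kind and a repo id with standard URL parsing. A minimal sketch, assuming the scheme shown in the capability list (the server's internal handling may differ):

```python
from urllib.parse import urlparse

def parse_hf_resource(uri: str) -> tuple[str, str]:
    """Split an hf:// resource URI into (kind, repo id).

    Handles the three resource shapes listed above, e.g.
    hf://model/bert-base-uncased or hf://dataset/org/name.
    Illustrative only; not the server's actual parser.
    """
    parsed = urlparse(uri)
    if parsed.scheme != "hf":
        raise ValueError(f"not an hf:// URI: {uri}")
    kind = parsed.netloc             # "model", "dataset", or "space"
    repo = parsed.path.lstrip("/")   # e.g. "org/repo-name"
    return kind, repo
```

For example, `parse_hf_resource("hf://model/bert-base-uncased")` yields `("model", "bert-base-uncased")`.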
Install
pipx install mcp-server-huggingface-hub

Configuration
{
"mcpServers": {
"huggingface": {
"command": "mcp-server-huggingface-hub",
"env": {
"HF_TOKEN": "${HF_TOKEN}"
}
}
}
}

Frequently asked questions
Can it run models locally?
Not directly. For local inference, pair it with a transformers-based tool or an Ollama MCP server. This server targets the Hub API, not local execution.
Does it support Inference Endpoints?
Yes — run_inference targets either a deployed Inference Endpoint URL (paid) or the serverless Inference API (rate-limited).
Is dataset download safe?
Only when the agent runs in a sandbox. Datasets can be large; by default the server exposes metadata-only tools and makes downloads an explicit action.
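A download gate of the kind described above might be sketched like this: metadata reports the size for free, and the actual fetch only proceeds with an explicit opt-in once the size exceeds a threshold. Function name, parameters, and the 1 GiB default are all hypothetical.

```python
def confirm_download(size_bytes: int,
                     limit_bytes: int = 1024**3,
                     approved: bool = False) -> bool:
    """Gate a dataset download on an explicit approval flag.

    Small datasets pass through; anything over limit_bytes requires
    the agent (or user) to have opted in. Illustrative sketch only.
    """
    if size_bytes > limit_bytes:
        return approved
    return True
```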
Sources
- Hugging Face Hub Python Library — accessed 2026-04-20
- Hugging Face Inference Endpoints — accessed 2026-04-20
- Model Context Protocol — accessed 2026-04-20