MCP Databricks Server
The MCP Databricks server connects LLM clients to a Databricks workspace. Databricks has released an official server that covers Unity Catalog browsing, SQL Warehouse execution, jobs API, and Mosaic AI model endpoints. It's ideal for data-engineer assistants that need to peek at tables, run diagnostic queries, or spin up a notebook job from chat. Authenticate with a personal access token or OAuth M2M for service scenarios.
MCP facts
- Kind: server
- Ecosystem: anthropic-mcp
- Language: Python
- Transports: stdio, http, sse
Capabilities
- Tools: run_sql, list_catalogs, describe_table, list_jobs, trigger_job, query_vector_search
- Resources: catalogs, schemas, tables, and job runs
- Auth: Databricks PAT or OAuth M2M service principal
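MCP clients invoke the tools above over JSON-RPC 2.0 using the `tools/call` method. As a minimal sketch of the wire format (the `statement` argument name is an illustrative assumption; query the server's `tools/list` for the real parameter schema):

```python
import json

# Build a JSON-RPC 2.0 "tools/call" request as an MCP client would send it.
# The argument key ("statement") is a placeholder assumption; the actual
# parameter names come from the server's published tool schema.
def build_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    message = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }
    return json.dumps(message)

payload = build_tool_call(1, "run_sql", {"statement": "SELECT 1"})
```

Over stdio transport this message is written to the server's stdin as a single line; over HTTP it is the POST body.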
Install
uvx databricks-mcp-server

Configuration
{
  "mcpServers": {
    "databricks": {
      "command": "uvx",
      "args": ["databricks-mcp-server"],
      "env": {
        "DATABRICKS_HOST": "https://adb-xxxx.azuredatabricks.net",
        "DATABRICKS_TOKEN": "dapi...",
        "DATABRICKS_WAREHOUSE_ID": "abc123"
      }
    }
  }
}

Frequently asked questions
What can the MCP Databricks server do?
Browse Unity Catalog, run SQL through a Warehouse, trigger and monitor jobs, and query Vector Search endpoints. The exact tool set varies by version; check the repo for the latest capabilities.
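SQL tools like run_sql typically map onto Databricks' SQL Statement Execution REST API (`POST /api/2.0/sql/statements`). A minimal sketch of that request, built but not sent so the shape is easy to inspect; the warehouse ID and token are placeholders:

```python
import json
import urllib.request

# Build (but don't send) a Databricks SQL Statement Execution API request.
# "wait_timeout" makes the call block briefly so small queries return inline.
def build_statement_request(host: str, token: str,
                            warehouse_id: str, statement: str):
    body = {
        "warehouse_id": warehouse_id,
        "statement": statement,
        "wait_timeout": "30s",
    }
    return urllib.request.Request(
        url=f"{host}/api/2.0/sql/statements",
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_statement_request(
    "https://adb-xxxx.azuredatabricks.net", "dapi...",
    "abc123", "SELECT current_catalog()",
)
# urllib.request.urlopen(req) would execute it against a live workspace.
```

Pointing DATABRICKS_WAREHOUSE_ID at a Serverless SQL Warehouse keeps cold-start latency low for interactive chat sessions.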
Can I use it with Databricks Apps?
Yes. Databricks Apps hosts HTTP-transport MCP servers so you can share them with a team without everyone needing a local install.
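A client pointed at such a shared deployment references a URL rather than a local command. A hedged sketch; the hostname and path are placeholders, and the exact config key depends on your MCP client:

```json
{
  "mcpServers": {
    "databricks": {
      "url": "https://your-app.databricksapps.com/mcp"
    }
  }
}
```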
How do I authenticate for a shared server?
Use an OAuth machine-to-machine (M2M) service principal with catalog-level grants. Avoid reusing a human PAT in shared deployments.
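The M2M flow exchanges a service principal's client ID and secret for a short-lived access token at the workspace's `/oidc/v1/token` endpoint. A minimal sketch that builds (but does not send) the client-credentials request; the ID and secret are placeholders:

```python
import base64
import urllib.parse
import urllib.request

# Build an OAuth 2.0 client-credentials token request for a Databricks
# service principal. Credentials go in a Basic auth header; the "all-apis"
# scope requests a token usable across workspace REST APIs.
def build_token_request(host: str, client_id: str, client_secret: str):
    form = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "scope": "all-apis",
    })
    basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return urllib.request.Request(
        url=f"{host}/oidc/v1/token",
        data=form.encode(),
        headers={
            "Authorization": f"Basic {basic}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        method="POST",
    )

req = build_token_request("https://adb-xxxx.azuredatabricks.net",
                          "client-id-placeholder", "client-secret-placeholder")
# The JSON response carries "access_token", used as a Bearer token.
```

Scope the service principal's grants at the catalog or schema level so the shared server can only reach the data the team intends to expose.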
Sources
- Databricks MCP documentation — accessed 2026-04-20
- Databricks MCP GitHub — accessed 2026-04-20