
Continue as an MCP Client

Continue is an open-source AI coding assistant that plugs into VS Code and JetBrains IDEs. In addition to its own slash commands and context providers, recent versions speak the Model Context Protocol (MCP): point Continue at an MCP server and its tools, resources, and prompts become first-class citizens inside chat and agent mode.

MCP facts

  • Kind: client
  • Ecosystem: anthropic-mcp
  • Language: TypeScript (extension); servers can be any language
  • Transports: stdio

Capabilities

  • Spawns stdio MCP servers and registers their tools for chat + agent mode
  • Exposes MCP prompts as Continue slash commands
  • Per-model allowlist of which tools a model is permitted to call

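Under the hood, a stdio MCP connection is JSON-RPC 2.0 sent one message per line over the server process's stdin/stdout. A minimal sketch of the framing a client like Continue emits when it starts a server (the method names and `protocolVersion` come from the MCP spec; the helper itself is illustrative, not Continue's actual code):

```typescript
// Build a newline-delimited JSON-RPC 2.0 message as written to an
// MCP server's stdin. Illustrative helper, not Continue internals.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

function encodeRequest(
  id: number,
  method: string,
  params?: Record<string, unknown>
): string {
  const msg: JsonRpcRequest = {
    jsonrpc: "2.0",
    id,
    method,
    ...(params ? { params } : {}),
  };
  return JSON.stringify(msg) + "\n"; // one message per line
}

// First message in every MCP session: initialize with client info.
const init = encodeRequest(1, "initialize", {
  protocolVersion: "2024-11-05",
  clientInfo: { name: "continue", version: "0.0.0" },
  capabilities: {},
});

// After initialization, the client discovers the server's tools,
// which Continue then registers for chat and agent mode.
const listTools = encodeRequest(2, "tools/list");
```

In practice the official `@modelcontextprotocol/sdk` package handles this framing, so clients rarely hand-roll it.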
Configuration

{
  "mcpServers": [
    {
      "name": "filesystem",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/projects"]
    },
    {
      "name": "postgres",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres", "postgresql://user:pass@localhost/app"]
    }
  ]
}
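Before pointing Continue at a config like the one above, it can be handy to sanity-check the `mcpServers` section programmatically. A small hypothetical helper (not part of Continue) that parses the JSON and lists the declared server names:

```typescript
// Minimal shape of one entry in the "mcpServers" array shown above.
interface McpServerConfig {
  name: string;
  command: string;
  args?: string[];
  env?: Record<string, string>;
}

// Parse a config file's contents and return the declared server names,
// throwing on entries missing the required fields. Hypothetical helper.
function mcpServerNames(configJson: string): string[] {
  const config = JSON.parse(configJson) as { mcpServers?: McpServerConfig[] };
  const servers = config.mcpServers ?? [];
  for (const s of servers) {
    if (!s.name || !s.command) {
      throw new Error("each MCP server needs a name and a command");
    }
  }
  return servers.map((s) => s.name);
}
```

Running it against the example config would return `["filesystem", "postgres"]`.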

Frequently asked questions

Is Continue open source?

Yes — Apache 2.0 licensed, hosted at github.com/continuedev/continue. You can self-host and audit the client code.

Does it work with local models?

Yes. Continue supports Ollama, llama.cpp, vLLM, and any OpenAI-compatible endpoint, and it can still expose MCP tools to those models when the model supports tool calls.
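For a model behind an OpenAI-compatible endpoint, exposing an MCP tool comes down to translating the server's `tools/list` entry into the endpoint's function-calling format. A hedged sketch of that mapping (the MCP-side field names follow the spec; the converter is illustrative, not Continue's implementation):

```typescript
// Shape of a tool as returned in an MCP server's tools/list response.
interface McpTool {
  name: string;
  description?: string;
  inputSchema: Record<string, unknown>; // JSON Schema for the arguments
}

// Shape expected by OpenAI-compatible chat-completion endpoints.
interface OpenAiTool {
  type: "function";
  function: {
    name: string;
    description?: string;
    parameters: Record<string, unknown>;
  };
}

// Translate an MCP tool definition so a local model (Ollama, vLLM, ...)
// can request it via tool calls. Illustrative mapping, not Continue code.
function toOpenAiTool(tool: McpTool): OpenAiTool {
  return {
    type: "function",
    function: {
      name: tool.name,
      description: tool.description,
      parameters: tool.inputSchema,
    },
  };
}
```

Both sides describe arguments with JSON Schema, which is why the mapping is nearly field-for-field.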

Where do I configure MCP servers?

Edit ~/.continue/config.json (or config.yaml). Continue hot-reloads the file, so edits take effect immediately.
