
Using MCP Servers via OpenAI Agents SDK

OpenAI's Agents SDK ships native MCP support, so no extra adapter is needed. The `MCPServerStdio`, `MCPServerSse`, and `MCPServerStreamableHttp` classes let your agent discover and call tools from any compliant MCP server. You attach a server to an `Agent`, and its tools appear alongside native function tools. The integration works with GPT-5 models and the Responses API, and can cache tool lists for you.

MCP facts

Kind
integration
Ecosystem
anthropic-mcp
Language
Python / TypeScript
Transports
stdio, streamable-http, sse

Capabilities

  • MCPServerStdio / MCPServerSse / MCPServerStreamableHttp classes
  • Tool list caching via cache_tools_list flag
  • Works with OpenAI Agents SDK handoffs and guardrails
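The remote-transport classes take a URL instead of a command. A minimal sketch, assuming an MCP server is already listening at `http://localhost:8000/mcp` (placeholder endpoint, not part of this page):

```python
import asyncio

from agents import Agent, Runner
from agents.mcp import MCPServerStreamableHttp


async def main():
    # Connect over Streamable HTTP instead of spawning a subprocess.
    # The URL below is a placeholder for your own MCP server.
    async with MCPServerStreamableHttp(
        params={"url": "http://localhost:8000/mcp"},
        cache_tools_list=True,
    ) as server:
        agent = Agent(
            name="Assistant",
            instructions="Use the remote tools when helpful.",
            mcp_servers=[server],
        )
        result = await Runner.run(agent, "What tools do you have?")
        print(result.final_output)


asyncio.run(main())
```

`MCPServerSse` follows the same pattern with an SSE endpoint URL; SSE is the older transport and Streamable HTTP is its successor in the MCP spec.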

Install

pip install openai-agents

Configuration

import asyncio

from agents import Agent, Runner
from agents.mcp import MCPServerStdio


async def main():
    async with MCPServerStdio(
        params={
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
        },
        cache_tools_list=True,
    ) as fs_server:
        agent = Agent(
            name="Assistant",
            instructions="Use the filesystem tools to help the user.",
            mcp_servers=[fs_server],
        )
        result = await Runner.run(agent, "List the files in /tmp.")
        print(result.final_output)


asyncio.run(main())

Frequently asked questions

Does OpenAI natively support MCP?

Yes. The OpenAI Agents SDK (Python and TypeScript) includes MCP server classes. The Responses API likewise accepts MCP-style tool specs for remote servers.
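On the Responses API path, a remote MCP server can be declared as a hosted tool and the model calls its tools server-side. A minimal sketch, assuming a reachable server at a placeholder URL (`server_label` and the URL are illustrative, not from this page):

```python
from openai import OpenAI

client = OpenAI()

# Declare a remote MCP server as a hosted tool. The API discovers the
# server's tools and lets the model invoke them during the response.
response = client.responses.create(
    model="gpt-5",
    tools=[{
        "type": "mcp",
        "server_label": "my_server",          # name shown in tool calls
        "server_url": "https://mcp.example.com/mcp",  # placeholder
        "require_approval": "never",          # skip per-call approval
    }],
    input="Summarize what tools this server exposes.",
)
print(response.output_text)
```

Note the difference in where tools execute: Agents SDK server classes run tool calls from your process, while the hosted `"type": "mcp"` tool runs them from OpenAI's side.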

What's cache_tools_list?

A boolean flag that caches the `list_tools` result for the lifetime of the server object. Turn it on when the server's tool set is static; it avoids a round-trip on every agent run.
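If the tool set does change mid-session, the server classes expose `invalidate_tools_cache()` to drop the cached list. A sketch of the pattern, reusing the filesystem server from above:

```python
import asyncio

from agents.mcp import MCPServerStdio


async def main():
    async with MCPServerStdio(
        params={
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
        },
        cache_tools_list=True,  # cache the first list_tools result
    ) as server:
        # ... run agents against `server` as usual ...

        # If the server's tool set changes mid-session, drop the cache
        # so the next agent run re-fetches the tool list:
        server.invalidate_tools_cache()


asyncio.run(main())
```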

Can I use Claude models with this SDK?

The Agents SDK is OpenAI-flavored; to drive Claude specifically, use Anthropic's Claude Agent SDK or LangGraph + langchain-mcp-adapters instead. The MCP server you connect to is the same; only the agent framework differs.

Sources

  1. OpenAI Agents SDK — MCP — accessed 2026-04-20
  2. OpenAI Agents SDK GitHub — accessed 2026-04-20