Curiosity · AI Model

GPT-5

GPT-5 is OpenAI's flagship large language model, released in August 2025: a unified multimodal system with native audio, vision, and code support, a switchable reasoning mode, and the deepest developer ecosystem of any model on the market. It is the default general-purpose choice for most teams evaluating AI infrastructure.

Model specs

Vendor            OpenAI
Family            GPT-5
Released          2025-08
Context window    400,000 tokens
Modalities        text, vision, audio, code
Input price       $10 / M tokens
Output price      $40 / M tokens
Pricing as of     2026-04-20
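At the listed rates ($10 per million input tokens, $40 per million output tokens), per-request cost is straightforward to estimate. A minimal sketch; the token counts in the example are illustrative:

```python
# Rough per-request cost at the listed GPT-5 rates (as of 2026-04-20).
INPUT_PRICE_PER_M = 10.0   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 40.0  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed per-token rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Example: a 3,000-token prompt with a 1,000-token completion.
print(f"${request_cost(3_000, 1_000):.3f}")  # → $0.070
```

Note the 4x premium on output tokens: for long completions, output cost dominates the bill.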

Strengths

  • Broadest ecosystem — Responses API, Realtime API, Assistants, Batch, Structured Outputs
  • Best-in-class audio (native speech in/out), competitive vision
  • Strong instruction following with JSON-mode + structured outputs
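JSON-mode requests follow the familiar chat-completions shape. Below is a minimal sketch of a request body built with the standard library only; the model name and exact field set are illustrative assumptions, not a definitive API reference:

```python
import json

# Hypothetical JSON-mode request body; the model name and field set
# are illustrative and may differ by API version.
body = {
    "model": "gpt-5",
    "response_format": {"type": "json_object"},  # ask for valid JSON output
    "messages": [
        {"role": "system", "content": "Reply with a JSON object."},
        {"role": "user", "content": "List three primary colors."},
    ],
}

payload = json.dumps(body)  # what would be POSTed to the API endpoint
print(json.loads(payload)["response_format"]["type"])  # → json_object
```

With `response_format` set this way, the model is constrained to emit syntactically valid JSON; structured outputs go further by enforcing a caller-supplied schema.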

Limitations

  • Slightly behind Claude Opus 4.7 on long-horizon coding agents
  • Thinking mode adds noticeable latency on hard problems
  • Context window smaller than Claude Opus 4.7's 1M

Use cases

  • General-purpose chat and copilot assistants
  • Voice / real-time audio agents via Realtime API
  • Multimodal understanding — image, PDF, screen capture
  • Enterprise deployments that standardise on OpenAI / Azure OpenAI

Benchmarks

Benchmark             Score    As of
MMLU-Pro              ≈87%     2026-04
SWE-bench Verified    ≈70%     2026-04
AIME 2025             ≈94%     2026-04

Frequently asked questions

What is GPT-5?

GPT-5 is OpenAI's flagship large language model, released in August 2025: a unified multimodal system spanning text, vision, audio, and code, with a switchable reasoning mode and the deepest developer ecosystem of any frontier model.

How does GPT-5 compare to Claude Opus 4.7?

GPT-5 is stronger on multimodal (particularly audio) and has a broader API surface. Claude Opus 4.7 leads on long-context and long-horizon coding agents and has a larger 1M-token context window.

What is GPT-5's context window?

As of April 2026, GPT-5 offers a 400,000-token context window through the Responses API. Batch and long-context modes are available for specific use cases.
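Even a 400,000-token window needs budgeting for large inputs. A rough fit check using the common ~4-characters-per-token heuristic (an approximation for planning only; a real tokenizer gives exact counts):

```python
CONTEXT_WINDOW = 400_000   # GPT-5 context window, as of April 2026
CHARS_PER_TOKEN = 4        # rough heuristic; use a real tokenizer for accuracy

def fits_in_window(text: str, reserve_for_output: int = 8_000) -> bool:
    """Estimate whether `text` plus an output reservation fits the window."""
    est_tokens = len(text) // CHARS_PER_TOKEN
    return est_tokens + reserve_for_output <= CONTEXT_WINDOW

doc = "x" * 1_200_000  # ~300k estimated tokens
print(fits_in_window(doc))  # → True (300k + 8k ≤ 400k)
```

Reserving headroom for the model's output matters: input and output tokens share the same window.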

Sources

  1. OpenAI — Models — accessed 2026-04-20
  2. OpenAI — Pricing — accessed 2026-04-20