
DeepSeek Coder 33B Instruct

DeepSeek Coder 33B Instruct, released in November 2023, was DeepSeek AI's flagship open-weights code model before the DeepSeek-Coder V2 MoE series. Trained on two trillion tokens spanning 80+ programming languages with a project-aware packing strategy, it led open-weights HumanEval scores at launch and set a template that later Chinese-lab coding LLMs followed.

Model specs

Vendor: DeepSeek
Family: DeepSeek Coder
Released: 2023-11
Context window: 16,384 tokens
Modalities: text, code

Strengths

  • Led open-weights HumanEval and MBPP at launch
  • Project-aware pretraining handles multi-file context
  • Permissive DeepSeek license for commercial use

Limitations

  • Surpassed by DeepSeek-Coder V2 and Qwen2.5-Coder on modern benchmarks
  • No persona or tool-use tuning beyond standard instruction following
  • 16K context is modest by 2026 standards

Use cases

  • Self-hosted coding copilot in enterprise environments
  • Fine-tuning target for domain-specific coding assistants
  • Classroom examples of open-weights coding LLMs
  • Research on repository-level code completion

Benchmarks

Benchmark   Score                        As of
HumanEval   ≈79%                         2023-11
MBPP        ≈70%                         2023-11
DS-1000     top open-weights at launch   2023-11
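The HumanEval and MBPP figures above are pass@1 rates: the fraction of problems whose generated solution passes the unit tests on the first sample. When more than one sample per problem is drawn, the standard unbiased pass@k estimator can be sketched as follows (a minimal helper, not tied to any particular evaluation harness):

```python
import math


def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimate for one problem.

    n: total samples generated for the problem
    c: number of samples that passed the unit tests
    k: budget of samples considered

    pass@k = 1 - C(n - c, k) / C(n, k), i.e. one minus the
    probability that a random size-k subset contains no passing sample.
    """
    if n - c < k:
        # Fewer failing samples than the budget: a passing sample
        # is guaranteed to appear in any subset of size k.
        return 1.0
    return 1.0 - math.comb(n - c, k) / math.comb(n, k)
```

For example, with 2 samples of which 1 passes, pass@1 is 0.5; a reported HumanEval score of ≈79% is the mean of this quantity over all 164 problems.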

Frequently asked questions

What is DeepSeek Coder 33B Instruct?

DeepSeek Coder 33B Instruct is DeepSeek AI's 33-billion-parameter open-weights coding LLM, released in November 2023 and trained on two trillion tokens spanning 80+ programming languages.

Is DeepSeek Coder 33B still competitive?

For modern projects, DeepSeek-Coder V2 or Qwen2.5-Coder is usually a better choice. The 33B instruct model remains a strong baseline and useful fine-tuning base.

Where can I download DeepSeek Coder 33B?

Base and instruct checkpoints are available on Hugging Face under the 'deepseek-ai' organisation with a permissive license.
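The instruct checkpoint expects an Alpaca-style prompt. The helper below sketches that format as a plain string; the exact system text and delimiters are an assumption based on the published model card, so verify against the tokenizer's built-in chat template (e.g. `tokenizer.apply_chat_template` in transformers) before relying on it:

```python
def build_prompt(instruction: str) -> str:
    """Build an instruct-style prompt for deepseek-coder-33b-instruct.

    The system line and the '### Instruction:' / '### Response:'
    delimiters mirror the format shown on the Hugging Face model
    card; treat them as an approximation of the real chat template.
    """
    system = (
        "You are an AI programming assistant. "
        "Only answer questions related to computer science."
    )
    return f"{system}\n### Instruction:\n{instruction}\n### Response:\n"
```

In practice, prefer loading the tokenizer from the `deepseek-ai/deepseek-coder-33b-instruct` repository and letting its chat template produce the prompt, so format changes on the Hub are picked up automatically.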

Sources

  1. arXiv — DeepSeek-Coder paper — accessed 2026-04-20
  2. Hugging Face — deepseek-ai/deepseek-coder-33b-instruct — accessed 2026-04-20