
Stable LM 2 1.6B

Stable LM 2 1.6B is Stability AI's 1.6-billion-parameter open-weight language model, released in January 2024. Trained on roughly 2 trillion tokens across seven languages (English, Spanish, German, French, Italian, Dutch, Portuguese), it posts competitive benchmark scores for its size and is designed as a foundation for small multilingual assistants.
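
A quick way to try the model is through the Hugging Face transformers library. The sketch below is illustrative rather than canonical: it assumes the repo id stabilityai/stablelm-2-1_6b, and transformers releases that predate the StableLM 2 integration may additionally need trust_remote_code=True.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_ID = "stabilityai/stablelm-2-1_6b"  # assumed Hugging Face repo id

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # ~3.2 GB of weights; fall back to float32 on older CPUs
    )

    prompt = "La capital de España es"  # Spanish: "The capital of Spain is"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))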

Model specs

Vendor: Stability AI
Family: Stable LM
Released: 2024-01
Context window: 4,096 tokens
Modalities: text

Strengths

  • Strong multilingual coverage for its size
  • Very small footprint; runs on CPU (see the sketch after this list)
  • Stability AI Community License allows research and small-business commercial use
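
To illustrate the CPU-friendly footprint, here is a hypothetical llama-cpp-python run against a 4-bit GGUF quantization of the model. The GGUF file name below is illustrative only (community conversions circulate under various names); point model_path at whatever file you actually have.

    from llama_cpp import Llama

    llm = Llama(
        model_path="stablelm-2-1_6b-Q4_K_M.gguf",  # hypothetical local file name
        n_ctx=4096,    # the model's full context window
        n_threads=4,   # CPU-only inference
    )

    # Dutch prompt: "Translate into English: good morning"
    out = llm("Vertaal naar het Engels: goedemorgen", max_tokens=32, temperature=0.0)
    print(out["choices"][0]["text"])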

Limitations

  • Short 4,096-token context window
  • Weaker reasoning than larger Gemma and Phi models
  • Non-standard license (Stability AI Community License) may not fit all organizations

Use cases

  • Tiny multilingual chatbots for European languages
  • Edge inference on phones and embedded devices
  • Fine-tuning base for niche, domain-specific assistants (see the LoRA sketch after this list)
  • Academic benchmarking of sub-2B multilingual models
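
For the fine-tuning use case, a LoRA adapter via the peft library keeps memory needs small. The sketch below is a starting configuration under assumptions: the repo id, the attention projection names (q_proj and friends, matching the Llama-style layout transformers uses for StableLM 2), and the hyperparameters are all illustrative rather than tuned.

    import torch
    from transformers import AutoModelForCausalLM
    from peft import LoraConfig, get_peft_model

    MODEL_ID = "stabilityai/stablelm-2-1_6b"  # assumed Hugging Face repo id

    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

    lora = LoraConfig(
        r=16,                   # adapter rank; illustrative, not tuned
        lora_alpha=32,
        lora_dropout=0.05,
        target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed module names
        task_type="CAUSAL_LM",
    )

    model = get_peft_model(model, lora)
    model.print_trainable_parameters()  # adapters are typically well under 1% of weights
    # Train with transformers.Trainer or a custom loop on your domain corpus.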

Benchmarks

Benchmark        Score   As of
MMLU             ~41%    2026-04
HellaSwag        ~69%    2026-04
ARC-Challenge    ~43%    2026-04

Frequently asked questions

What is Stable LM 2 1.6B?

Stable LM 2 1.6B is Stability AI's 1.6-billion-parameter open-weight multilingual language model, pretrained on roughly 2 trillion tokens across seven European languages.

How does Stable LM 2 1.6B compare to Gemma 2 2B?

Stable LM 2 1.6B is slightly smaller and offers broader multilingual coverage, while Gemma 2 2B benefits from knowledge distillation and usually scores higher on English benchmarks. Both are reasonable starting points among tiny models.

Sources

  1. Stable LM 2 1.6B on Hugging Face — accessed 2026-04-20
  2. Stability AI — Stable LM 2 announcement — accessed 2026-04-20