Aya 23 35B
Aya 23 35B, released by Cohere For AI in May 2024, is a multilingual open-weights LLM built on the Command R base and instruction-tuned on data spanning 23 languages. At launch it was among the strongest open-weights multilingual models, outperforming its predecessor Aya 101 across the languages it covers despite supporting far fewer languages overall.
Model specs
- Vendor: Cohere For AI
- Family: Aya
- Released: 2024-05
- Context window: 8,192 tokens
- Modalities: text
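
Because the weights are openly available on Hugging Face (see Sources), the model can be loaded with standard tooling. Below is a minimal sketch, assuming a recent transformers release with Cohere architecture support and enough GPU memory (or quantization) for the 35B weights; the prompt and generation settings are illustrative, and the repo ID is taken from the sources listed at the end.

```python
# Minimal sketch: load Aya 23 35B from Hugging Face and generate one reply.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CohereForAI/aya-23-35B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # spread the 35B weights across available GPUs
)

# Chat-style prompt in one of the 23 covered languages (here: Turkish,
# "What is the capital of Turkey?").
messages = [{"role": "user", "content": "Türkiye'nin başkenti neresidir?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The tokenizer ships a chat template for the instruction-tuned format, so prompts in any of the 23 covered languages can be passed as plain user messages.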
Strengths
- Strong per-language quality across 23 languages
- Open weights under CC-BY-NC 4.0 for research use
- Backed by Cohere For AI's open-science programme
Limitations
- Non-commercial research license limits production use
- Covers only 23 languages, far narrower than Aya 101's 101-language breadth
- No vision or tool-use capabilities
Use cases
- Multilingual research and academic experiments
- Region-specific chatbots and content moderation
- Fine-tuning baselines for underrepresented languages (see the adapter sketch after this list)
- Benchmarks for multilingual open-weights models
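
For the fine-tuning-baseline use case, the sketch below shows one common way to attach LoRA adapters to the open weights before training on data in a target language. The peft library, the target module names, and the hyperparameters are illustrative assumptions rather than a recipe from Cohere.

```python
# Minimal sketch: wrap Aya 23 35B with LoRA adapters as a fine-tuning baseline.
# Hyperparameters and target modules below are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained(
    "CohereForAI/aya-23-35B",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Low-rank adapters on the attention projections keep the trainable
# parameter count small relative to the frozen 35B base.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
# Train the adapters with your preferred trainer on instruction data in the
# target language, then merge or load them at inference time.
```

Adapter-based tuning is a pragmatic starting point here because full fine-tuning of 35B parameters is rarely feasible on academic hardware.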
Benchmarks
| Benchmark | Result | As of |
|---|---|---|
| AyaEval multilingual | State-of-the-art among open-weights models at launch | 2024-05 |
| MMLU (en) | ≈67% | 2024-05 |
Frequently asked questions
What is Aya 23 35B?
Aya 23 35B is Cohere For AI's 35-billion-parameter open-weights multilingual model, released in May 2024 with high-quality instruction tuning across 23 languages.
What is the license for Aya 23?
Aya 23 is distributed under CC-BY-NC 4.0 for research use. Commercial deployments require a separate agreement with Cohere.
How is Aya 23 different from Aya 101?
Aya 101 covered 101 languages with lower per-language quality. Aya 23 narrows to 23 languages but delivers substantially stronger per-language performance.
Sources
- Cohere For AI — Aya 23 announcement — accessed 2026-04-20
- Hugging Face — CohereForAI/aya-23-35B — accessed 2026-04-20