2025: The Year Qwen Became Essential
In 2025, Alibaba Cloud transformed Qwen from a promising project into a global benchmark for open-source AI. From language models to code tools, multimodal capabilities, and image generation, the Qwen ecosystem now covers the entire AI spectrum.
January: Qwen2.5-Max Challenges DeepSeek
January 28, 2025 — Alibaba launches Qwen2.5-Max, a MoE (Mixture of Experts) model trained on over 20 trillion tokens.
| Benchmark | Qwen2.5-Max vs. DeepSeek V3 |
|---|---|
| Arena-Hard | Outperforms |
| LiveBench | Outperforms |
| LiveCodeBench | Outperforms |
| GPQA-Diamond | Outperforms |
| MMLU-Pro | Competitive |
This launch immediately positions Qwen as a serious competitor to proprietary models.
March: Multimodal with Qwen2.5-Omni
March 27, 2025 — Qwen2.5-Omni-7B arrives, a model capable of processing text, images, audio, and video simultaneously.
Thinker-Talker Architecture
| Component | Role |
|---|---|
| Thinker | Processes multimodal inputs |
| Talker | Generates streaming voice responses |
| TMRoPE | Synchronizes video and audio temporally |
This 7B parameter model offers real-time conversations with natural speech synthesis, rivaling much larger models.
April: Qwen3 Revolutionizes Open-Source
April 29, 2025 — The Qwen3 family arrives with a complete range of models.
Dense Models
| Model | Parameters | Context |
|---|---|---|
| Qwen3-32B | 32B | 128K |
| Qwen3-14B | 14B | 128K |
| Qwen3-8B | 8B | 128K |
| Qwen3-4B | 4B | 32K |
| Qwen3-1.7B | 1.7B | 32K |
| Qwen3-0.6B | 0.6B | 32K |
MoE Models
| Model | Total | Active | Context |
|---|---|---|---|
| Qwen3-235B-A22B | 235B | 22B | 128K |
| Qwen3-30B-A3B | 30B | 3B | 128K |
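The appeal of the MoE design is that only a fraction of the total parameters fire per token. A back-of-the-envelope calculation (illustrative only) from the figures in the table above:

```python
# Illustrative arithmetic: fraction of parameters active per token
# for the two Qwen3 MoE models listed above (total, active).
models = {
    "Qwen3-235B-A22B": (235e9, 22e9),
    "Qwen3-30B-A3B": (30e9, 3e9),
}

for name, (total, active) in models.items():
    print(f"{name}: {active / total:.1%} of parameters active per token")
```

So the 235B flagship activates roughly 9% of its weights per token, which is why its inference cost is closer to a ~22B dense model than a 235B one.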
Key Innovations
- 36 trillion tokens of training (2x Qwen2.5)
- 119 languages supported
- Hybrid thinking modes: deep reasoning or fast response
- Performance comparable to DeepSeek-R1, o1, and o3-mini
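The hybrid modes can be toggled per request: the Transformers chat template accepts an `enable_thinking` flag, and the Qwen3 model card also documents `/think` and `/no_think` soft switches appended to a user turn. The helper below is a minimal sketch of the soft-switch convention (the function name is ours, not from the Qwen documentation):

```python
def with_thinking_switch(prompt: str, think: bool) -> str:
    """Append Qwen3's documented soft switch to a user message.

    `/think` requests the deep-reasoning mode; `/no_think` requests
    the fast direct-answer mode. The switch rides along inside the
    user turn rather than being a separate API parameter.
    """
    return f"{prompt} {'/think' if think else '/no_think'}"

messages = [
    {"role": "user",
     "content": with_thinking_switch("Solve 24 * 17 step by step.", think=True)},
]
print(messages[0]["content"])  # ends with "/think"
```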
July: Qwen3-Coder and the CLI
July 22, 2025 — Alibaba launches its most powerful code model: Qwen3-Coder-480B-A35B.
Specifications
| Aspect | Detail |
|---|---|
| Total parameters | 480B |
| Active parameters | 35B |
| Native context | 256K tokens |
| Extended context | 1M tokens (YaRN) |
| Code data | 7.5T tokens (70% code) |
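In the Transformers ecosystem, YaRN context extension is typically enabled by adding a `rope_scaling` section to the model's `config.json`. A sketch of a 4x extension (256K native to 1M); the exact field values here are assumptions and should be checked against the model card:

```json
{
  "rope_scaling": {
    "rope_type": "yarn",
    "factor": 4.0,
    "original_max_position_embeddings": 262144
  }
}
```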
Performance
State-of-the-art among open-source models on:
- Agentic Coding: comparable to Claude Sonnet 4
- Agentic Browser-Use
- Agentic Tool-Use
- SWE-Bench Verified: state-of-the-art without test-time scaling
Qwen Code CLI
Alibaba also launches Qwen Code, an open-source CLI fork of Gemini CLI:
```shell
npm i -g @qwen-code/qwen-code
```
Compatible with Claude Code, Cline, and Alibaba Cloud’s DashScope API.
July: Translation with Qwen-MT
July 24, 2025 — Qwen-MT arrives for multilingual translation.
- Support for 92 official languages and dialects
- High-quality translation
- Optimized for speed
August: Image Generation with Qwen-Image
August 4, 2025 — Qwen-Image is a 20B parameter foundation model based on the MMDiT architecture.
Highlights
- Native text rendering: multi-line text and paragraph-level layout
- Precise image editing
- Optimized MMDiT architecture
August 19, 2025 — Qwen-Image-Edit extends these capabilities to image editing, combining Qwen2.5-VL for semantic control and a VAE encoder for appearance.
September: Safety with Qwen3Guard
September 23, 2025 — Qwen3Guard is the first safety model in the Qwen family.
| Feature | Description |
|---|---|
| Real-time detection | Prompt and response analysis |
| Risk levels | Graded classification |
| Risk categories | Categorized detection |
| Multilingual | English, Chinese, and other languages |
Qwen3Guard makes it possible to build safety guardrails directly into Qwen-based applications.
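A common integration pattern is to gate both the incoming prompt and the outgoing response through the classifier. The sketch below stubs the classifier call; the `classify` callable and its label set are illustrative stand-ins (Qwen3Guard returns graded risk levels and categories, as described above, rather than a single binary verdict):

```python
from typing import Callable

# Illustrative label set; treat as a simplified stand-in for
# Qwen3Guard's graded risk levels.
BLOCKED = {"unsafe"}

def guarded_reply(prompt: str,
                  classify: Callable[[str], str],
                  generate: Callable[[str], str]) -> str:
    """Run a safety check on the prompt, then on the model's response."""
    if classify(prompt) in BLOCKED:
        return "Request declined by safety policy."
    response = generate(prompt)
    if classify(response) in BLOCKED:
        return "Response withheld by safety policy."
    return response

# Stub wiring for demonstration only.
demo = guarded_reply(
    "hello",
    classify=lambda text: "safe",
    generate=lambda text: "hi there",
)
print(demo)  # -> hi there
```

Checking the response as well as the prompt matters: a benign prompt can still elicit an unsafe completion.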
2025 Release Recap
| Date | Release | Type |
|---|---|---|
| Jan 28 | Qwen2.5-Max | MoE LLM |
| Mar 27 | Qwen2.5-Omni-7B | Multimodal |
| Apr 29 | Qwen3 (8 models) | LLM |
| Jul 22 | Qwen3-Coder-480B | Code |
| Jul 22 | Qwen Code CLI | Tool |
| Jul 24 | Qwen-MT | Translation |
| Aug 4 | Qwen-Image | Image Generation |
| Aug 19 | Qwen-Image-Edit | Image Editing |
| Sep 23 | Qwen3Guard | Safety |
What This Means
In 2025, Alibaba demonstrated that an open-source AI ecosystem can rival proprietary giants. With models covering language, code, multimodal, images, and safety, Qwen offers a comprehensive and accessible alternative.
Alibaba’s strategy—releasing high-performance models under open licenses—accelerates adoption and innovation while democratizing access to cutting-edge AI technologies.