Alibaba has unveiled Qwen3, the newest iteration of its open-source large language model (LLM) series, aimed at advancing AI capabilities across reasoning, multilingual support, and cost-efficient deployment.
The new release comprises six dense models and two Mixture-of-Experts (MoE) models and introduces a hybrid reasoning mechanism, making these the company’s first LLMs capable of switching between “thinking” and “non-thinking” modes. This design lets Qwen3 work through complex tasks such as math, code generation, and logical deduction in thinking mode, while handling general-purpose queries quickly in non-thinking mode.
Developers using the API can also control how long the model stays in thinking mode, with a thinking budget of up to 38,000 tokens. This flexibility helps balance intelligence against computational efficiency in large-scale applications, and the flagship Qwen3-235B-A22B MoE in particular achieves substantial cost savings without sacrificing performance.
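To make the mode switch concrete, the minimal sketch below shows how the toggle is exposed in the open-weights release through Hugging Face transformers; the checkpoint name and the enable_thinking flag follow Qwen’s published model cards, and the API-side thinking budget is configured separately through Model Studio, so treat this as an illustration rather than the official integration path.

```python
# Minimal sketch: toggling Qwen3's thinking mode with Hugging Face transformers.
# Assumes the `enable_thinking` chat-template flag described in Qwen's model cards.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen3-8B"  # any Qwen3 checkpoint works the same way
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype="auto", device_map="auto"
)

messages = [{"role": "user", "content": "Prove that the sum of two even numbers is even."}]

# Thinking mode: the model emits a chain-of-thought block before its final answer.
text = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True, enable_thinking=True
)

# Non-thinking mode: set enable_thinking=False to skip the reasoning trace
# for fast, general-purpose replies.

inputs = tokenizer(text, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=1024)
print(tokenizer.decode(outputs[0][inputs.input_ids.shape[-1]:], skip_special_tokens=True))
```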
Qwen3 models were trained on 36 trillion tokens, double the data used for the previous generation, Qwen2.5. As a result, they deliver improvements across multilingual tasks, reasoning, and tool use; they now support 119 languages and dialects and outperform earlier Qwen models in translation and instruction following.
Support for advanced agent-based tasks is also built in: Qwen3 natively integrates the Model Context Protocol (MCP) and supports robust function calling.
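As a rough illustration of what that function-calling support looks like in practice, the sketch below sends a tool definition to a Qwen3 model served behind an OpenAI-compatible endpoint (for example via vLLM); the endpoint URL, model name, and get_weather tool are assumptions for the example, not details from the announcement.

```python
# Function-calling sketch against a Qwen3 model behind an OpenAI-compatible API.
# The base_url, model name, and get_weather tool are illustrative assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool for the example
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="Qwen/Qwen3-235B-A22B",
    messages=[{"role": "user", "content": "What's the weather in Hangzhou?"}],
    tools=tools,
)

# If the model decides to call the tool, its arguments arrive as JSON that the
# caller executes and feeds back in a follow-up message.
print(response.choices[0].message.tool_calls)
```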
Benchmarks reflect the performance leap: Qwen3 ranks among the top performers on AIME25 (math), LiveCodeBench (coding), BFCL (function calling), and Arena-Hard (instruction following). Alibaba attributes this to architectural innovations and a four-stage post-training pipeline that includes long chain-of-thought (CoT) cold-start training, reasoning-focused reinforcement learning (RL), thinking mode fusion, and general RL refinement.
Availability
The entire Qwen3 lineup is freely available through Hugging Face, GitHub, and ModelScope, with a live demo at chat.qwen.ai; API access via Alibaba’s Model Studio is coming soon.