PANews reported on April 16th that, according to the official WeChat account of Tongyi Lab, Alibaba's Tongyi Lab has open-sourced the sparse MoE model Qwen3.6-35B-A3B. The model has 35 billion total parameters, of which only 3 billion are activated during inference. It is claimed to significantly outperform its predecessor, Qwen3.5-35B-A3B, on programming benchmarks such as SWE-bench and Terminal-Bench, and to be comparable to larger dense models such as Qwen3.5-27B and Gemma-31B. The model natively supports text-image multimodal understanding and spatial intelligence, achieving a RefCOCO score of 92.0 and an ODInW13 score of 50.8. It is already open-sourced on Hugging Face and ModelScope and can be deployed locally; it is also planned to be offered via Alibaba Cloud's Bailian platform as an API compatible with the OpenAI and Anthropic protocols, supporting integration with programming assistants such as OpenClaw, Claude Code, and Qwen Code.
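Since the planned Bailian API is described as OpenAI-compatible, client code would presumably send standard OpenAI-style chat requests. The sketch below only constructs such a request payload; the model identifier is a placeholder assumption, as the report does not specify the exact API model name or endpoint.

```python
import json

# Hypothetical OpenAI-style chat request payload for an OpenAI-compatible API.
# The model identifier below is a placeholder, not a confirmed API name.
payload = {
    "model": "qwen3.6-35b-a3b",  # placeholder identifier
    "messages": [
        {"role": "user", "content": "Write a Python function that reverses a string."}
    ],
}

# Serialize to JSON, as it would be sent in the HTTP request body.
body = json.dumps(payload)
print(body)
```

An OpenAI-compatible endpoint means existing SDKs and tools can typically be pointed at the service simply by overriding the base URL and API key, without changing request structure.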
Tongyi Qwen3.6-35B-A3B is an open-source model focused on low-cost agent programming and multimodal reasoning.
Author: PA一线
This content is for market information only and is not investment advice.