Alibaba's Next-Gen Qwen3.5 Model Emerges, Multiple Open-Source Releases Likely

Deep News
02/09

On February 9, a new pull request integrating Qwen3.5 into the Transformers library was opened in the HuggingFace open-source repository. Industry insiders speculate that Alibaba's next-generation foundation model, Qwen3.5, is nearing its official release.

Some observers commented that this signals the start of a "crazy February" led by Chinese large language models. According to available information, Qwen3.5 uses a novel hybrid attention mechanism and is very likely a native vision-language model with visual understanding capabilities. Developers have further uncovered that Qwen3.5 may be open-sourced in at least two variants: a 2B dense model and a 35B-A3B MoE model.

Previously, The Information reported that Qwen3.5 would be open-sourced during the Chinese New Year holiday. In early February, Tang Jie, Chief Scientist at Zhipu AI, also posted on Weibo suggesting that numerous new models, including DeepSeek v4, Qwen3.5, and GLM-5, would launch soon.

