Alibaba's Next-Gen Qwen3.5 Model Emerges, Multiple Open-Source Releases Likely

Deep News
02/09

On February 9, a new pull request for integrating Qwen3.5 appeared in Hugging Face's open-source Transformers repository. Industry insiders speculate that Alibaba's next-generation foundation model, Qwen3.5, is nearing its official release.

Some observers commented that this signals the start of a "crazy February" led by Chinese large language models. According to available information, Qwen3.5 uses a novel hybrid attention mechanism and is very likely a native vision-language model capable of visual understanding. Developers have further uncovered that Qwen3.5 may open-source at least a 2B dense model and a 35B-A3B MoE model.
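For readers unfamiliar with the naming convention: in a label like "35B-A3B", the first number is the model's total parameter count and the "A" number is the parameters actually activated per token, since a mixture-of-experts (MoE) model routes each token through only a few of its experts. The sketch below shows the arithmetic with entirely made-up numbers chosen to land near 35B/3B; it does not reflect Qwen3.5's actual configuration, which has not been disclosed.

```python
def moe_param_counts(n_experts, experts_per_token, params_per_expert, shared_params):
    """Total vs. active parameter counts for a hypothetical MoE model.

    shared_params covers everything outside the expert FFNs (attention,
    embeddings, router), which runs for every token regardless of routing.
    """
    total = shared_params + n_experts * params_per_expert
    active = shared_params + experts_per_token * params_per_expert
    return total, active


# Illustrative (made-up) config: 128 experts, 8 routed per token.
total, active = moe_param_counts(
    n_experts=128,
    experts_per_token=8,
    params_per_expert=0.27e9,
    shared_params=0.6e9,
)
print(f"total ≈ {total / 1e9:.1f}B, active ≈ {active / 1e9:.1f}B per token")
# total ≈ 35.2B, active ≈ 2.8B per token
```

The appeal of this design is inference cost: per-token compute scales with the active parameters (~3B here), while the full parameter pool (~35B) gives the model much more capacity than a 3B dense model.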

Previously, The Information reported that Qwen3.5 would be open-sourced during the Chinese New Year holiday. In early February, Tang Jie, Chief Scientist at Zhipu AI, also posted on Weibo suggesting that numerous new models, including DeepSeek v4, Qwen3.5, and GLM-5, would be launched soon.

