赛道Hyper | Matching the Global Top Tier: Qwen3 Reasoning Model Goes Open Source

华尔街见闻
Aug 06

By Zhou Yuan / 华尔街见闻

On July 25, Alibaba open-sourced the Qwen3 reasoning model. It is the first code model in the Qwen series to adopt a Mixture of Experts (MoE) architecture, with 480B (480 billion) total parameters and native support for a 256K-token context window that can be extended to 1M tokens. It can help programmers handle basic coding tasks such as writing and completing code, substantially improving programming productivity. Mixture of Experts (MoE) is an efficient neural-network architecture design whose core idea is to improve the model through a division of labor among specialized sub-networks...
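To make the "division of labor" idea concrete, below is a minimal sketch of a top-k MoE layer in PyTorch. The expert count, layer sizes, and routing scheme here are illustrative assumptions chosen for exposition only; they are not Qwen3-Coder's actual configuration or implementation.

```python
# Illustrative sketch of a top-k mixture-of-experts (MoE) layer.
# All sizes (d_model, d_ff, n_experts, top_k) are placeholder assumptions,
# not the real Qwen3 settings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Each "expert" is an independent feed-forward sub-network.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])
        # The router scores every token against every expert.
        self.router = nn.Linear(d_model, n_experts)

    def forward(self, x):                                # x: (batch, seq, d_model)
        scores = self.router(x)                          # (batch, seq, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # keep only the top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = (idx[..., k] == e)                # tokens routed to expert e at rank k
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

x = torch.randn(2, 16, 512)
print(MoELayer()(x).shape)  # torch.Size([2, 16, 512])
```

Because only top_k of the n_experts run for each token, per-token compute tracks the active experts rather than the full parameter count, which is what keeps inference on very large MoE models practical.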


