On May 7, Li Auto held the second season of its AI Talk. Asked what he had learned from DeepSeek, Li Auto chairman and CEO Li Xiang noted that DeepSeek-V3 is an MoE (mixture-of-experts) model. "MoE is a very good architecture," he said. "It essentially combines a group of experts, with each one corresponding to a specific expert capability."
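To make the "group of experts" idea concrete, here is a minimal, illustrative sketch of an MoE layer with top-k routing. It is not Li Auto's or DeepSeek's implementation; the layer sizes, number of experts, and top_k value are arbitrary assumptions chosen only to show how a router sends each token to a few experts and mixes their outputs.

```python
# Minimal sketch of a Mixture-of-Experts (MoE) layer with top-k routing.
# Illustrative only: dimensions, expert count, and top_k are assumptions,
# not DeepSeek-V3's actual configuration.
import numpy as np

rng = np.random.default_rng(0)

d_model, d_hidden = 16, 32   # assumed toy dimensions
n_experts, top_k = 4, 2      # assumed expert count and routing width

# Each "expert" is a small feed-forward network with its own weights.
experts = [
    (rng.standard_normal((d_model, d_hidden)) * 0.1,
     rng.standard_normal((d_hidden, d_model)) * 0.1)
    for _ in range(n_experts)
]
# The router (gate) scores how relevant each expert is to a given token.
router = rng.standard_normal((d_model, n_experts)) * 0.1

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ router                              # (tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]    # indices of chosen experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        chosen = logits[t, top[t]]
        weights = np.exp(chosen - chosen.max())
        weights /= weights.sum()                     # softmax over chosen experts only
        for w, e in zip(weights, top[t]):
            w1, w2 = experts[e]
            h = np.maximum(x[t] @ w1, 0.0)           # ReLU feed-forward expert
            out[t] += w * (h @ w2)
    return out

tokens = rng.standard_normal((3, d_model))           # three toy "tokens"
print(moe_forward(tokens).shape)                     # (3, 16)
```

The point of the sketch is the division of labor Li Xiang describes: every expert holds its own parameters, and only a small subset of experts is activated per token, so capacity grows without every input paying for every expert.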
Li Xiang went on to explain that DeepSeek demonstrates the best practice for building expert capability, in four steps. The first step is always research. "This first step is critically important: whenever we want to change or improve a capability, the first step must be research." Only then comes the second step, R&D; the third step is expressing the capability; and the fourth step is turning the capability into business value.
(Source: Guangzhou Daily Xinhuacheng)