Microsoft Plans to Expand Computing Clusters for Its Self-Developed Large Models, Competing with OpenAI and Other Rivals

Deep News
09/12

Microsoft plans to expand its computing infrastructure for training its own artificial intelligence (AI) models, hoping to compete with rivals such as OpenAI and Anthropic.

Mustafa Suleyman, head of Microsoft's consumer AI business, told employees at an all-hands meeting on Thursday that the company will make "significant investments" in its own computing clusters for model training.

According to attendees, Suleyman told staff that AI "self-sufficiency" is crucial for a company of Microsoft's scale. Alongside building its own models, Microsoft will also deepen its cooperation with OpenAI and work with other AI model developers.

The world's largest software company currently relies primarily on OpenAI's models to power AI capabilities across its products. However, signs of tension have emerged as the two companies launch competing products and each seeks partnerships elsewhere. Suleyman, a co-founder of DeepMind, joined Microsoft last year to lead its self-developed model efforts and build its consumer-facing AI business.

Last month, Microsoft released its first in-house large model, developed under Suleyman's leadership and trained on 15,000 NVIDIA H100 chips. Suleyman added that cutting-edge models from Meta, Alphabet's Google, and Elon Musk's xAI use clusters roughly six to ten times larger than Microsoft's, suggesting, he said, that Microsoft completed training more efficiently.

Microsoft CEO Satya Nadella added that the company will adopt a multi-model strategy across all of its products, using whichever AI models customers prefer. Earlier this week, reports indicated that Microsoft plans to adopt Anthropic's models in some of its products.

