Insights from OpenAI's Stargate Challenges for Anthropic

Deep News
Yesterday

Over the past year, a joke has circulated within the data center industry—and even inside OpenAI: the $500 billion Stargate initiative resembles more of an improvisational concept than a meticulously planned blueprint. Recent investigations reveal that the current Stargate project bears little resemblance to the grand $500 billion vision promoted at a White House event over a year ago. Stargate now lacks a dedicated team and has no clearly defined core role in data center construction, ultimately serving as a flexible umbrella term for OpenAI's computing power collaborations. Longtime readers may recall that in March 2024, "Stargate" referred to a planned $100 billion supercomputer partnership between OpenAI and Microsoft.

This situation highlights a deeper conflict between the ambitions of AI labs and the costly realities of building and investing in infrastructure. OpenAI's evolving approach may offer valuable lessons for competitors like Anthropic, which are also pursuing multi-gigawatt computing expansions.

**Vision First, Funding Later** When Stargate was initially announced, the three partners—OpenAI, Oracle, and SoftBank—touted a staggering figure: $500 billion in investment to build 10 gigawatts of computing power. However, these companies had not yet identified the source of the funding. As the project progressed, OpenAI became increasingly reliant on Oracle, which holds an investment-grade credit rating and offered more favorable financing terms than OpenAI could secure independently.

Although the public announcement of Stargate generated significant attention, the project ultimately had to address practical challenges, including implementation and funding. Anthropic has internally discussed targets to secure approximately 10 gigawatts of computing power in the coming years and is likely conducting similar assessments—presenting goals to partners and potential investors to gauge their willingness to contribute.

External analysts expect Anthropic to seek a financially strong partner, similar to Oracle, to help secure favorable financing conditions. Unlike OpenAI, Anthropic has not publicly announced its ambitions at high-profile events like the White House, allowing it to negotiate details discreetly.

**Risk Sharing** One of Stargate's most notable contributions to AI infrastructure may be the unconventional risk-sharing model agreed upon by OpenAI and Oracle. The two companies are collaborating on a 4.5-gigawatt data center project, sharing certain economic risks.

In simple terms:

- If the project faces delays or exceeds its budget, OpenAI bears the additional costs.
- If costs come in lower than expected, OpenAI keeps the savings.
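The pass-through mechanics above can be sketched with a toy comparison. All figures and function names here are hypothetical for illustration; the actual contract terms between OpenAI and Oracle are not public.

```python
# Toy comparison of a cost pass-through lease vs. a conventional
# fixed-price lease. All numbers are hypothetical.

def tenant_cost_passthrough(budgeted: int, actual: int) -> int:
    """Under the pass-through model described above, the tenant
    absorbs overruns and keeps savings, so it pays the actual cost."""
    return actual

def tenant_cost_fixed(budgeted: int, actual: int) -> int:
    """Under a conventional fixed-price arrangement, the tenant pays
    the agreed budget regardless of what construction actually cost."""
    return budgeted

budget  = 10_000_000_000  # hypothetical $10B build budget
overrun = 12_000_000_000  # build runs 20% over budget
savings =  9_000_000_000  # build comes in 10% under budget

# Overrun case: the tenant pays $2B more than under a fixed price.
print(tenant_cost_passthrough(budget, overrun) - tenant_cost_fixed(budget, overrun))

# Savings case: the tenant pays $1B less (negative difference).
print(tenant_cost_passthrough(budget, savings) - tenant_cost_fixed(budget, savings))
```

The design trade-off is who holds construction risk: pass-through shifts cost variance from the builder (Oracle) to the tenant (OpenAI), which is why Oracle investors may welcome it while it raises questions about OpenAI's exposure.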

Given current trends of rising AI infrastructure expenses and frequent delays, OpenAI's computing costs are likely to increase under this arrangement. The structure is uncommon: cloud customers typically do not assume data center construction cost risk, and nothing at this scale has been attempted before. It remains unclear whether OpenAI can absorb such cost variability, or how large the potential increases might be.

For Oracle investors, this arrangement is likely favorable, as they closely monitor how the rapidly growing cloud business impacts the company's profit margins. Anthropic is expected to watch this partnership model closely, as it will face similar risk-sharing decisions when signing additional computing agreements in the coming years.

**Proceeding Cautiously with Self-Built Infrastructure** OpenAI's experience suggests that Anthropic should exercise caution before building its own data centers from the ground up. OpenAI's self-build plans have faced repeated revisions and delays, suggesting Anthropic may be wise to postpone full self-construction for several years.

Previous reports indicate that Anthropic plans to increase long-term direct leasing of data center capacity—rather than merely renting chips from cloud providers—to gain greater control over computing deployment. Direct leasing represents a middle ground, offering companies more influence over location and capacity without requiring the capability to build data centers from scratch.

While self-built infrastructure may eventually prove financially justified for AI labs, it may not be the optimal choice in the race to rapidly secure computing power.

**Musk's Turbine Troubles** Elon Musk's xAI is facing strong opposition to its plan to power its data centers in Mississippi and Tennessee with dozens of small gas turbines. At a recent state air emissions permit hearing in Southaven, Mississippi, hundreds of protesters urged the state's Department of Environmental Quality to reject the project.

This controversy may serve as an early test of whether tech giants can bypass traditional utility companies by quickly deploying off-grid, temporary turbine setups.

**Other News**

- Cloverleaf Infrastructure, a two-year-old data center power company founded by former Microsoft energy executives, is receiving acquisition offers.
- SemiAnalysis, a research firm specializing in chips and infrastructure, is considering raising funds to invest in startups.
- DG Matrix, a startup developing new transformers for AI data centers, has completed a $60 million funding round led by Engine Ventures.
