Altman Outlines OpenAI's Latest Strategy: Enterprise API Revenue Surpasses Consumer End, New Model Launch in Q1 Next Year, Compute Power Determines Revenue Ceiling

Deep News
12/19

As the AI race intensifies into close competition, market focus is shifting: model superiority is no longer the sole concern; the ability to consistently convert model capabilities into revenue and cash flow is the new differentiator.

In a rare one-on-one interview on the latest episode of *The Big Technology Podcast*, OpenAI CEO Sam Altman systematically addressed the most pressing external questions across business, product, and infrastructure dimensions. Multiple statements conveyed a clear signal: OpenAI stands at a critical inflection point transitioning from a "phenomenal product company" to an "enterprise AI platform."

**Why Has ChatGPT Remained Largely Unchanged for Three Years? The Answer Lies in "Generality"** Altman admitted he initially thought ChatGPT's chat interface wouldn't last this long, but reality proved that general, low-barrier interaction methods were severely underestimated. However, he explicitly stated that ChatGPT's ultimate form won't just be a "dialog box": future AI will work proactively rather than reactively, generate different interfaces for different tasks, run continuously in the background while interrupting users only at critical moments, and evolve from a "tool" to an "intelligent agent."

This is the underlying logic behind OpenAI's simultaneous advancement of browsers, devices, and agents—the goal isn't to create a smarter chatbot but to become the "default intelligence layer." Altman reiterated that "memory" is one of AI's most long-term valuable capabilities, with current AI memory functions still in the "GPT-2 era." Future AI will remember every word you've said and every decision you've made, capturing not just facts but preferences, emotions, and habits—something human assistants could never achieve.

**Enterprise API Revenue Surpasses Consumer End, Growth Engine Shifts** On the business front, Altman clarified that OpenAI isn't "pivoting" from a consumer company into the enterprise market but is instead capitalizing on natural momentum. To date, OpenAI boasts over 1 million enterprise users, with API business growth outpacing ChatGPT itself. This year, API contributions to overall growth have even exceeded those of consumer products. In his view, enterprises don’t need fragmented AI functionalities but a complete, unified, and scalable AI platform. He proposed that future enterprise IT architectures will simultaneously host "traditional cloud" and "AI cloud." OpenAI isn’t trying to replicate AWS but to build an intelligent infrastructure layer capable of handling trillions of tokens.

**When Will GPT-6 Arrive? New Models Progress, but Naming Becomes Irrelevant** Regarding the model roadmap, Altman didn’t provide a clear timeline for "GPT-6" but confirmed OpenAI will launch a significantly advanced model in Q1 next year compared to GPT-5.2. Model upgrades continue unabated, but naming conventions are no longer the focus. On the hardware front, OpenAI is preparing to release a series of small AI devices rather than a single hit product. Altman predicts computing devices will fundamentally change—from passive tools responding to commands to intelligent systems that proactively understand users' lives, contexts, and collaborative relationships. Under this vision, current screen-and-app-centric devices are ill-suited for an "AI-first" world. Next-gen hardware will become the key gateway for long-term memory, continuous perception, and proactive decision-making.

**Why Bet Big on Compute? Revenue Bottlenecks Lie in Infrastructure, Not Demand** More than the question of whether AGI has been achieved, Altman is concerned with an issue the market overlooks: are existing AI capabilities being fully utilized? His answer is unequivocal—no. Enterprises remain stuck in superficial applications like having AI draft copy, tweak code, or summarize documents. Organizationally, AI is still seen as an auxiliary tool rather than a true "team member" participating in decision-making, execution, and collaboration. This isn't due to weak models but to enterprises' unpreparedness to restructure workflows, roles, and responsibilities around AI. Thus, even if model capabilities plateau temporarily, existing potential alone holds massive economic value—it simply hasn't been systematically activated.

Altman believes this "capability surplus" directly alters the nature of compute investments. At this stage, compute isn't just an expanding cost center but the critical constraint determining whether latent demand translates into actual revenue, given sufficient model capabilities. He emphasized that compute investment is essentially pre-positioning for future usage. Over the past year, OpenAI's compute capacity grew roughly threefold, with revenue growth keeping pace and no idle resources or monetization challenges. In other words, doubling compute would nearly double revenue. The real risk, he argues, isn't a compute surplus but whether infrastructure will be ready when society and enterprises finally complete their structural adaptation to AI. That will be the defining moment of the next phase of the AI race, where growth constraints shift from model capabilities to pre-laid compute and platform breadth.

**Competitive Focus Shifts: From Model Parameters to Platform Breadth** Facing rapid catch-up from models like Gemini and DeepSeek, Altman doesn’t shy away from competitive pressure. He admits OpenAI feels the heat, internally triggering frequent "code red" alerts, but doesn’t believe it’s losing its lead. In his view, model capability gaps will eventually narrow; what truly sets leaders apart are productization abilities, distribution efficiency, and capacity to build long-term user relationships. The "distribution vs. product" debate is a false dichotomy—ChatGPT itself is distribution, which must rest on sustainably evolving products. ChatGPT became the world’s largest AI gateway precisely due to its ultra-low barrier and generality. Currently, weekly active users approach 900 million, and this scale effect is reinforcing OpenAI’s enterprise market competitiveness.

**Podcast Transcript Highlights** *Sam Altman:* "You know the $1.4 trillion figure—we’ll deploy it gradually over a long period. I wish we could move faster. It’s best to explain upfront how these numbers will work."

*Alex Kantrowitz:* "Exponential growth is often hard to grasp intuitively. Today, OpenAI CEO Sam Altman joins us to discuss OpenAI’s winning plan amid intensifying competition, rational infrastructure investment, and when an IPO might arrive."

*Altman on Compute:* "If we had double the compute today, revenue would nearly double too. The constraint isn’t demand but infrastructure."

*On Enterprise Growth:* "We have over 1 million enterprise users. API growth this year even surpassed ChatGPT’s. Enterprises want one unified AI platform."

*On Future Models:* "A significantly better model than GPT-5.2 is coming in Q1 next year. Naming like ‘GPT-6’ isn’t the focus—capabilities are."

*On AGI:* "Today’s models are incredibly smart but lack continuous learning. Once they can realize gaps, learn overnight, and solve them next day—that’s closer to AGI."

*On IPO:* "Being a public CEO excites me 0%. But as a company, going public has pros and cons. We’ll need capital and may eventually exceed private shareholder limits."

*On Compute’s Role in Science:* "Massive compute can accelerate discoveries. Early signs show AI-aided science is taking off—small wins now, big breakthroughs in five years."

*Final Note:* Altman closed by stressing that OpenAI’s edge lies not just in models but in integrating compute, product, and distribution to serve scaled demand—a trifecta he believes will sustain leadership as AI’s economic impact unfolds.

