Rapid investment in AI infrastructure has sparked concern that the recent tech boom could turn into a bubble. Is Meta Platforms, Inc., along with the rest of Silicon Valley, pouring too much money into AI?
"Of course, no Meta executive would answer 'yes'," said Alex Schultz, Meta's Chief Marketing Officer and VP of Analytics, in an interview at the Web Summit tech conference in Lisbon.
Meta plans to invest up to $72 billion in AI infrastructure this year and has already announced that spending will continue to rise next year. CEO Mark Zuckerberg stated earlier this year that he would rather "overspend by tens of billions" than fall behind in developing superintelligence.
Other tech giants such as Amazon, Google, and Microsoft, along with private AI firms like OpenAI, are also making record investments in AI-related areas, including chips, data center construction, and competitive salaries to attract and retain top AI research and engineering talent.
"Aggressive, But Not Insane"
Despite the staggering figures, Schultz argued that, whether measured against market value or industry revenue, the current wave of spending does not look excessive next to historical bubbles. He drew a parallel to the U.S. railroad bubble of the late 19th century: "It looks aggressive, but it's not insane."
In October, Goldman Sachs analysts estimated in a research report that U.S. AI-related investment accounts for less than 1% of GDP, whereas earlier technology booms (including railroad construction) reached 2% to 5% of GDP.
Investments Are Paying Off
Schultz noted that Meta's AI investments are already generating billions in revenue by improving ad tools and content-ranking algorithms. The company is projected to bring in around $200 billion in revenue this year, with a market cap of approximately $1.5 trillion.
He highlighted the more advanced content recommendation system as Meta's biggest AI-driven transformation. The improvement is crucial because users now spend most of their time on Facebook and Instagram browsing "unconnected content," meaning posts that come not from friends or from pages and groups they follow.
Video Generation Consumes Massive Energy
Schultz mentioned Meta AI's newly launched Vibes feed, a stream of short, fully AI-generated videos, which "could define much of the company's future direction" and has shown strong user retention.
Video-generation models require significantly more computing power than text or image models, leading to massive energy consumption and potential strain on power grids and water supplies. Applications like OpenAI's Sora have sparked debate: is their entertainment value worth the environmental cost?
"Vibes isn't huge—it won't drain lakes or require multiple nuclear plants," Schultz said, adding that it's just one of many experiments the company is running to train and refine AI models.
AI Boom Sparks Nuclear Safety Discussions
Schultz pointed out that the AI boom has also spurred productive discussions about nuclear plant safety and the use of desalination facilities. "Overall, humanity is fully capable of creating far more abundant resources than we currently have," he said.