Google may be Nvidia's biggest rival in chips - and now it's upping its game
By Britney Nguyen
The company's seventh-generation TPU chip is custom-built to handle tasks such as large-scale AI model training and high-volume AI inference
Google previewed its Ironwood TPUs on Thursday.
Google's custom artificial-intelligence chips have been lauded by some as the most credible alternative to Nvidia's graphics processing units, and the company is touting its latest version as its "most powerful and energy-efficient" yet.
The tech giant (GOOGL) (GOOG) announced on Thursday that it is rolling out its seventh-generation tensor processing unit, Ironwood, in the coming weeks. The chips are "purpose-built for the most demanding workloads," Google said, including training and reinforcement learning for large-scale AI models, as well as high-volume, low-latency AI inference. Ironwood delivers four times the training and inference performance of the sixth-generation Trillium TPU, according to Google; inference is the process of running AI models after they have been trained.
Google also said it is previewing new instances of its custom-designed Axion central processing units, which are based on the Arm architecture. Its new N4A virtual machine is the company's most cost-effective so far, it said; the general-purpose virtual machines are designed to run common workloads such as web servers and databases. Google's first Arm-based bare-metal instance, C4A, will be available in preview soon, it said, adding that the instance is aimed at workloads such as Android operating-system development and automotive systems.
"To thrive in an era with constantly shifting model architectures, software and techniques, you need a combination of purpose-built AI accelerators for model training and serving, alongside efficient, general-purpose CPUs for the everyday workloads, including the workloads that support those AI applications," the company said.
See more: Google may be sitting on a $900 billion gem that could disrupt Nvidia's dominance
Up to 9,216 Ironwood TPUs can be connected in a single unit, or pod, which acts as Google's AI supercomputer, using the company's Inter-Chip Interconnect networking. The design avoids data bottlenecks for large, data-intensive AI models, Google said.
The connectivity not only speeds communication between chips, but also lets all of them access 1.77 petabytes of shared high-bandwidth memory, which is essential for large-scale inference. Google's Optical Circuit Switching, or OCS, helps route traffic around service interruptions, the company said.
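For context, a rough back-of-envelope calculation, based only on the pod-level figures above and not on a per-chip spec cited in this article, suggests the shared pool works out to roughly 192 gigabytes of high-bandwidth memory per chip at full pod scale:

```python
# Illustrative arithmetic only: per-chip HBM inferred from the pod-level
# figures above (9,216 chips sharing 1.77 petabytes), not a quoted spec.
chips_per_pod = 9_216                      # Ironwood TPUs in a full pod
shared_hbm_gb = 1.77 * 1_000_000           # 1.77 PB expressed in gigabytes (decimal units)
hbm_per_chip_gb = shared_hbm_gb / chips_per_pod
print(round(hbm_per_chip_gb))              # -> 192
```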
Google uses its TPUs to run its Search and YouTube algorithms, and to power its Gemini AI models. The company announced last month that it will give Anthropic access to up to 1 million of the custom chips to train and run the AI startup's Claude large language models. Anthropic has so far relied on its main cloud provider, Amazon Web Services (AMZN), and that platform's Trainium and Inferentia chips.
The company described early responses to its Ironwood TPU as "overwhelmingly enthusiastic." Google added that Anthropic "expects to see impressive price-performance gains that accelerate" everything from training to running its Claude models.
In September, D.A. Davidson's Gil Luria said Google's TPUs had closed the gap with Nvidia (NVDA) to become "the best alternative." If Google combined its TPU business and Google DeepMind AI research lab, Luria said it could be worth $900 billion.
"Should this business ever be spun-off, investors would be getting a leading AI accelerator supplier and frontier AI lab in one, making it arguably one of Alphabet's most valuable businesses," Luria said in his note.
Melius Research analyst Ben Reitzes shared similarly positive sentiments about Google's TPUs in a note late last month, in which he called the TPU "the most proven ASIC out there," referring to application-specific integrated circuits, which are custom-built to handle specific AI tasks.
Reitzes said Google has been able "to innovate quickly with Gemini" using its TPUs.
"The decision to develop this product early is now starting to inflect to the upside - delivering for both Broadcom's AI revenues and Google Cloud (GCP) growth," Reitzes said.
Read: Google wins high praise for its chip efforts - and that can help Broadcom's stock too
-Britney Nguyen
This content was created by MarketWatch, which is operated by Dow Jones & Co. MarketWatch is published independently from Dow Jones Newswires and The Wall Street Journal.