The second fiddle to Nvidia in artificial-intelligence chips has quietly become the seventh-largest company in America by stock market value. Broadcom has returned 84% since I wrote about it here a year ago, after it had cracked the top 10, and 1,785% since I first recommended it in Barron's in 2016.
Back then Broadcom was a highly successful chip roll-up, buying companies that had attractive product margins but too much corporate overhead for their size. It was also a key supplier of iPhone innards. Today, the buzz is about a type of AI chip called an XPU. The reason Broadcom keeps ending up in the right market at the right time is Hock Tan, its CEO -- a Malaysian-born, M.I.T.-educated executive who once did stints at General Motors and PepsiCo.
See, in 1999, Hewlett-Packard sliced off the worst part of its business, a measurement company it called Agilent Technologies, so it could focus on computers. And in 2005, private equity bought the worst part of Agilent, its boom-and-bust semiconductor unit, and called it Avago Technologies. To run the thing, it turned to Tan, whose latest job as a semiconductor CEO had ended in a buyout. Tan went on an acquisition spree, culminating in a 2016 deal for Broadcom, whose name Avago adopted while keeping its ticker symbol, AVGO.
Deals since then have largely involved software: CA Technologies, Symantec's enterprise security business, and, two years ago, VMware. Skeptics said the company was straying from its core expertise. But Tan has turned software into a source of steady, high-margin revenue to offset volatility in non-AI hardware. And in AI, demand is exploding.
If you're building an AI computing center, you're probably going to want GPUs, or graphics processing units, so named because their highly parallel processing got its first big commercial application firing pixels for videogames. The runaway leader here is Nvidia. Advanced Micro Devices just passed Intel in data center sales, but that's mostly thanks to its CPUs, or central processing units, for routine tasks -- its GPUs for AI are off to a slow start. That leaves Nvidia with monstrous pricing power.
If you're a hyperscale computing company, you might want to use Nvidia for only some of your AI chips, and design others yourself to save cash, or to diversify your supply chain, or even to improve performance. For some tasks, you don't need an off-the-shelf GPU that can be all things to all customers. You just need a chip that can do a specific thing. Broadcom is top dog there in something called ASICs, or application-specific integrated circuits. Alphabet's Google was an early adopter of ASICs for AI and has been a key Broadcom customer for more than a decade.
The nomenclature here gets messy in a hurry. I'm seeing "XPU" a lot, although I don't think even Silicon Valley knows what it stands for. There aren't that many "x" words, and I'm ruling out the two musical instruments, xylophone and xaphoon. I've seen claims that it's an x-containing word, like extreme or auxiliary, or that the x is like an algebra variable: a whatever-you-want processing unit. Unhelpfully, Google refers to its XPUs as TPUs, as in tensor, a type of data array.
The important thing to know is that many, but not all, XPUs use ASICs, and if you want to make leading-edge ones, you're likely to call on a company that has a wealth of design and manufacturing experience, including deep ties with Taiwan Semiconductor Manufacturing. That means Broadcom primarily, along with companies like Marvell Technology and MediaTek.
A year ago, Broadcom had three official hyperscale AI ASIC customers. Now it's up to seven. That might not sound like a lot, but these are major players like Meta Platforms (Facebook, Instagram), ByteDance (TikTok), and OpenAI (ChatGPT), and there might be other, unnamed ones. New customers can continue scaling up in XPUs for years, and next-generation networking switches, which Broadcom also sells, enable clusters to put many more chips to work. Tan has said that his original three customers could reach a million chips apiece by 2027, for a combined market of $60 billion to $90 billion, with Broadcom collecting a dominant share.
Melius Research analyst Ben Reitzes reckons that will be good for AI chip revenue of $36 billion for Broadcom by 2027. And as new customers beyond the original three reach scale, growth could accelerate, with AI revenue reaching $70 billion later in the decade. For comparison, last quarter Broadcom reported AI revenue of $4.4 billion, up 46%, and predicted $5.1 billion for the current quarter. Total company revenue was $15 billion last quarter, up 20%. Free cash flow was $6.4 billion, up 44%.
That's the good news. Now for the price. "Why are you writing about a stock that's already up?" I'm sometimes asked. It's because that's what good stocks tend to do -- go up. So sometimes one that's already up goes up more. In 2016, when I first wrote about Broadcom, it had just gained 470% over five years, but that hasn't stopped it from gaining much more since.
A couple of things about that, though: One, early on, Broadcom was a value stock. I was making a case that the shares could rise to a mid-teens multiple of free cash flow. Now it trades at 32 times estimated free cash flow for the next four quarters. So what it does from here might depend as much on investor sentiment as on economics.
Two, I'm pretty sure I'm an unremarkable stockpicker, Broadcom notwithstanding -- I saved that confession for the end, in case you stopped reading halfway through. My best guess is that Broadcom can ride AI spending higher over the coming year. But to believe it will rise another 80% or 100% in a hurry, you'd probably have to believe it will overtake Meta, Alphabet, or Amazon.com in market value -- a tall order.