Chicago Federal Reserve President Austan Goolsbee has cast doubt on the prevailing narrative surrounding AI-driven productivity gains, challenging the logic behind potential interest rate cuts advocated by the Trump administration and the incoming Fed Chair.
Speaking at the Hoover Institution's annual monetary policy conference at Stanford University on Friday, Goolsbee warned that the widespread expectation of AI boosting productivity could itself push interest rates higher. If the technological revolution disappoints, the outcome could be worse: stagflation.
"The bigger the hype, the bigger the potential for a hangover," Goolsbee stated.
Citing Chicago Fed survey data, he noted that economists, technology professionals, and the general public all expect to gain approximately one additional percentage point of productivity growth per year over the next decade.
This widespread expectation itself poses a risk of economic overheating. His remarks challenge the "AI-driven rate cut" narrative promoted by incoming Fed Chair Warsh and the Trump administration.
Warsh is reportedly expected to be confirmed by the Senate on Monday as the 17th Chair of the Federal Reserve. He has previously stated that AI will usher in "the most productivity-enhancing wave in our lifetimes" and characterized it as a "structurally disinflationary" factor, suggesting the Fed would therefore have more room to cut rates.
U.S. Treasury Secretary Scott Bessent holds a similar view, likening the current situation to "the budding stage of a productivity boom, not unlike the 1990s."
**Expectations Themselves Are the Risk**
Goolsbee's core argument is that the macroeconomic impact of productivity gains depends on whether they arrive as a "surprise" or are "widely anticipated."
He explained that when productivity improves more than expected, inflation falls and interest rates can follow it lower. When the market has already fully priced in the technological dividend, however, as with the current AI enthusiasm reflected in financial markets and on corporate balance sheets, households and businesses rush to increase spending and investment before the productivity gains actually materialize.
This "borrowing from the future" behavior can cause current economic overheating, which in turn pushes interest rates higher.
Using the 1990s tech boom as an example, he pointed out that the Fed, led by then-Chairman Alan Greenspan, actually raised interest rates six consecutive times between 1999 and 2000 precisely to address the pressure from demand being pulled forward by anticipated productivity gains.
Goolsbee stated that he finds it "somewhat difficult to understand" the reasoning of Warsh and others who cite the 1990s analogy as a basis for cutting rates.
**If AI Fails, Stagflation Risk Emerges**
When pressed by former St. Louis Fed President James Bullard on what would happen if AI productivity expectations are not met, Goolsbee offered a more severe assessment.
He said that if the market continues to expect a boom, continually pulling forward future consumption and investment, and the technological dividend ultimately fails to materialize, the economy could enter a recession against a backdrop of overheated demand and persistently high inflation. He stated:
"You could easily get stagflation. This isn't a bubble; it's the fundamentals."
Goolsbee also listed several leading indicators he is monitoring:
* The wealth effect from housing prices driving consumer spending.
* The data center construction boom pushing up land and chip costs, with this spillover already affecting industries unrelated to AI.
* Changes in the number of workers leaving the labor market in anticipation of increased future wealth.
**Internal Divergence: Other Voices Question the Logic**
Goolsbee's assessment is not without dissent. At the same forum, Fed Governor Waller countered his core argument.
Waller stated that the wealth effect channel described by Goolsbee "has been in a lot of models for a long time" but "has not shown up consistently in the data." He added that if real-world factors—such as households' difficulty in easily mortgaging future income or more gradual spending adjustments—are incorporated into the models, the effect would be significantly weakened.
Atlanta Fed visiting scholar Steven Davis raised a different concern. Citing recent Atlanta Fed analysis, he noted that the average AI investment expenditure per firm is 14 times the median, indicating that the investment boom is highly concentrated among a few firms, not widespread.
University of Chicago economist Luigi Zingales presented another perspective: a New York Fed survey shows a growing number of respondents expect to lose their jobs to AI, which could raise the savings rate rather than pull consumption forward.
This runs counter to Goolsbee's concern, and Goolsbee himself acknowledged that the dynamic could point to the opposite conclusion.