The era of subsidies will eventually end, and users will ultimately bear the hidden cost behind today's glossy new AI features. Technology commentator Ernie Smith recently cautioned that leading US AI firms such as Anthropic and OpenAI are using massive subsidies to foster user dependency, a model he argues cannot last. When those subsidies recede, ordinary users and businesses will be left footing the bill. As he put it:
"All those Super Bowl ads and new product expenses will eventually come due."
Smith shared a personal example: while testing Anthropic's newly launched Claude Design product, he exhausted his daily quota with just two commands by 1 PM on a Friday. Around the same time, GitHub announced a shift to usage-based billing for its Copilot service, a quiet signal of where the industry is heading. Citing prominent AI skeptic Ed Zitron, Smith noted that mainstream media coverage of AI services often overlooks how much each task actually costs users.
Subsidies act as a trap: users are being conditioned for future exploitation. Smith's core argument is straightforward:
"These mainstream AI products are subsidized, and users have grown addicted to the subsidies. It's not sustainable, but the goal is to make you dependent enough that you might tolerate the costs when they arrive."
He echoed Ed Zitron's characterization of AI services as fundamentally different from traditional software:
"Generative AI services are, to a large extent, experimental products that operate unlike any modern software or hardware. You can't simply approach ChatGPT or Claude and expect it to start working efficiently."
Smith reiterated that the enormous expenses behind Super Bowl advertisements and continuous product launches will eventually come due, writing:
"And you'll be trapped, paying not just subscription fees but also ever-increasing usage costs."
The strategy of major firms: locking in businesses with creative tools leads to heftier bills. Smith believes that leading providers like Anthropic are primarily targeting corporate clients, those willing to pay a premium for efficiency, which implies steeper costs. Anthropic recently announced integrations with popular creative tools, including Affinity, Adobe Creative Cloud, and the controversial 3D software Blender. Smith observed:
"I think Anthropic has identified a use case useful enough—and token-intensive enough—to justify significant corporate spending. You thought Creative Cloud was expensive already."
He predicts that numerous companies will be lured by such "efficiency tools," only to "find themselves spending substantial sums for marginal productivity gains." The problem, in his view, is that these firms prioritize efficiency over cost control.
A viable path: acknowledge the bill and opt for a smaller one. Smith's advice is not to abandon AI but to stay financially aware. He suggests that open-source and open-weight models will follow a trajectory similar to open-source software. Many may overlook them because they lack advertising, sales teams, or flashy products like Sora, but a segment of users will recognize that "burning vast sums on top-tier models isn't a sound investment when cheaper alternatives exist—including some that can operate on a laptop."
Smith concluded with a warning:
"The bill will arrive. And you'll want it to be a smaller one. Trust me."