From Participant to Leader: How Lenovo Establishes Gateway Dominance in AI Implementation

Deep News
Feb 13

The Year of the Red Horse arrives in 2026. Four years after the AI frenzy swept the globe, the industry is quietly entering a new phase, shifting from the capital-intensive race to train models to an era focused on commercialization efficiency. The emergence of ChatGPT in late 2022 ignited global enthusiasm for generative AI and prompted tech giants to invest hundreds of billions of dollars in building ever larger, more complex models. By early 2026, the focus of investors and corporate executives has clearly shifted. They are no longer satisfied with dazzling technology demonstrations; they are asking a more practical question: when will these massive investments translate into substantial revenue and profit?

In this transition, the hybrid AI architecture is gradually becoming the mainstream consensus. Hybrid AI combines the power of large cloud-based models with local computing on personal devices and edge equipment, aiming to embed AI into real business scenarios at lower cost and in a more controllable manner. Lenovo Group, the world's largest supplier of personal computers and workstations, is standing out in this new phase by leveraging its vast installed base and end-to-end delivery capabilities.

On February 12, Lenovo announced its results for the third quarter of the 2025/26 fiscal year, ended December 31, 2025. Group revenue reached 157.5 billion yuan, a year-on-year increase of more than 18%, significantly exceeding consensus expectations and setting a new record. Adjusted net profit grew 36% year-on-year, beating market expectations by 32%, with profit growing at twice the rate of revenue, demonstrating Lenovo Group's resilience and reliability in delivering on its promises amid a complex environment.

Over the past quarter, the industry faced multiple challenges, including rapidly rising prices of key components such as memory chips, intensified supply-demand swings, and questions about the sustainability of the AI cycle. Lenovo not only honored its earlier commitment to the market of "double-digit revenue growth and profitability" but also, through its comprehensive advantages in supply chain management, operational capability, and overall resilience, answered external concerns directly with a report card that delivers on its promises. In short, Lenovo's hybrid AI strategy does not chase absolute leadership on any single technical metric; instead, it occupies the most advantageous structural position on the path to AI monetization: proximity to real customers, stable shipment volumes, and continuous service capability.

After the capital tide recedes, AI must learn to do the math

To understand why hybrid AI, and Lenovo, are coming into focus now, one must first look at how the AI industry has evolved over the past three to four years. Since the debut of ChatGPT, tech giants such as Microsoft, Google, Amazon, and Meta have engaged in an unprecedented capital expenditure race, pouring hundreds of billions of dollars into data centers and GPUs. The theme of that stage was staking a claim: whoever possesses the strongest model owns the future. By 2025, however, although model capabilities continued to improve, diminishing marginal returns began to appear. For enterprise customers, general-purpose large language models are impressive, but the accompanying data privacy concerns, heavy inference costs, and high latency have become the final obstacles to large-scale AI deployment.

Investor patience is limited. More and more people have noticed a troubling piece of arithmetic: there is a huge gap between current investment in AI infrastructure and the actual revenue generated by AI software. If AI cannot turn from an expensive toy into a productivity tool, this boom will be difficult to sustain.

The industry's center of gravity is shifting dramatically from training to inference. Training is the process of creating the AI brain; it is centralized, one-off, and extremely expensive. Inference is the process of using that brain; it is distributed, continuous, and extremely cost-sensitive. As AI enters the monetization phase, most computing demand will come from inference. At this stage, the key competitive metric is no longer who has more H100 GPUs, but who can let users call AI functions a hundred times a day at the lowest cost and in the most convenient way. This is the fundamental reason for the shift in market sentiment. A pure cloud AI model faces unbearable bandwidth costs and energy bills when serving high-frequency daily calls from billions of users; the market urgently needs a more economical and practical solution. In this transition from dreaming in the cloud to prospecting on the ground, the companies closest to customers, with physical delivery channels, hold the initiative in monetization.

Whoever controls the gateway holds the power of AI distribution

In the battle for AI monetization, owning the algorithm does not equal owning the revenue. History shows that the inventors of a technology are often not its biggest commercial beneficiaries; the distributors and integrators are. In the current AI business ecosystem, which companies are most sought after? The answer: manufacturers that cover a broad customer base, possess global supply chain resilience, and can deliver at scale. This is the core logic behind the revaluation of Lenovo Group. As the world's largest PC manufacturer, a leading server provider, and a giant with a complete portfolio of smart devices, Lenovo is not playing a vanity game of whose model has more parameters; it is building the physical network that AI deployment requires.

Imagine a multinational bank deciding to deploy AI assistants for its 50,000 employees. It will not merely purchase an API. It needs 50,000 PCs capable of running AI models locally, matching private edge servers to handle sensitive data, and a complete set of services spanning hardware to underlying maintenance. This is Lenovo's moat. Lenovo has a massive global installed base of devices, which is not just a hardware asset but a natural gateway for AI. In the monetization phase, hardware is the platform: every keyboard, every screen, every workstation is a touchpoint where AI meets the real world.

Lenovo's heavily promoted AI PC strategy is often misread as a simple hardware upgrade, but it is actually a Trojan horse for the popularization of AI. By integrating local NPUs and heterogeneous computing power into PC endpoints, Lenovo effectively shifts inference costs from the cloud to the device. For users, this means faster response times and better privacy protection; for the industry, it means the cost of using AI is significantly diluted, making large-scale monetization possible. Under this logic, Lenovo is no longer a traditional hardware assembler but a computing power operator, transporting and storing expensive cloud computing power in hundreds of millions of endpoint devices and making it readily accessible. Whoever ships the most stable volume owns the most extensive AI real estate, and in this dimension Lenovo's global channel network and supply chain capabilities constitute barriers that pure software companies find hard to overcome.

The financial report bears this out. In the third quarter, Lenovo's AI PC revenue increased 39% year-on-year, with an average selling price of $845, significantly higher than the industry average and providing structural support for profitability. Lenovo is also expanding its personal intelligent gateways through a multi-device portfolio: by the end of the third quarter, PCs and smartphones each accounted for about half of the global activations of Lenovo and Motorola AI devices. As the super agent Qira is subsequently pre-installed and deployed at scale, the flywheel effect of Lenovo's personal intelligent ecosystem is expected to amplify further.

Hybrid AI: A Replicable Profit Model

If we look at the ultimate form of AI monetization, hybrid AI is not a technical compromise but the commercially optimal solution. AI commercialization currently faces two deadlocks: cost, because cloud inference is too expensive, and trust, because enterprises dare not send core data to the public cloud. The core of hybrid AI is tiered processing: simple, personal, and private tasks are handled on endpoint devices such as phones and PCs; complex tasks that require vast knowledge bases are handled in the cloud; and sensitive enterprise business processes are handled at the edge. The commercial implications of this architecture are profound, because it turns AI into a replicable profit model (a minimal, purely illustrative sketch of this routing logic appears at the end of this section).

For Lenovo, hybrid AI transforms its business from simply selling devices to selling capabilities. Through its full-stack device-edge-cloud layout, Lenovo can provide customers with a turnkey AI solution. Unlike startups facing a binary outcome, either striking it rich through an IPO or losing everything, Lenovo plays the role of a steady-state beneficiary in the AI monetization phase.

First, endogenous growth from the hardware cycle: AI's demand for endpoint computing power is strongly driving the replacement cycle for PCs and servers. This is not conceptual hype but rigid demand rooted in physical performance. The end of support for Windows 10, combined with the explosion of AI applications, will bring Lenovo deterministic hardware revenue growth.

Second, the blue ocean of edge computing: as AI enters factories, hospitals, and retail stores, demand for edge computing servers is exploding. Lenovo's positioning in ISG (Infrastructure Solutions Group) places it precisely at this growth point. Enterprises need AI in a box, not just AI in the cloud, and this is Lenovo's area of expertise.

Third, stickiness from devices to solutions: hybrid AI is extremely complex and difficult for enterprises to build on their own. Through SSG (Solutions and Services Group), Lenovo provides deployment, maintenance, and consulting for hybrid AI, turning one-time hardware sales into continuous service revenue. This revenue stream is highly counter-cyclical and carries high profit margins.

The third-quarter numbers reflect this shift. Lenovo's AI-related revenue increased 72% year-on-year, accounting for 32% of the Group's total revenue. Within that, AI PC revenue grew 39% year-on-year, AI smartphone revenue surged 202%, AI server revenue increased 59%, and AI services revenue grew 127%, with its share of SSG's total revenue rising 6.2 percentage points year-on-year, driving the continuous evolution of the Group's business mix toward higher value.

Lenovo's hybrid AI strategy embodies a deep technological pragmatism. It does not seek to top a single large-model leaderboard or release the coolest demo. It aims to be the company that lays the power grid, manufactures the meters, and keeps electricity flowing steadily to every household when AI becomes an infrastructure like electricity. In business history, the players in the middle, connecting the source of technology with end users, often possess the most enduring vitality. Lenovo is becoming the weaver of that network in the AI era.
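For readers who want a concrete picture of the tiered processing described above, here is a minimal, purely hypothetical sketch of how a hybrid AI router might decide where a request runs. The tier names, request fields, and token threshold are invented for illustration and do not reflect Lenovo's actual software or any specific product.

```python
# Illustrative sketch only: a hypothetical router for the device/edge/cloud
# tiers described in the article. All names and thresholds are assumptions.
from dataclasses import dataclass
from enum import Enum


class Tier(Enum):
    DEVICE = "on-device NPU"      # simple, personal, privacy-sensitive tasks
    EDGE = "enterprise edge"      # sensitive business workflows
    CLOUD = "public cloud model"  # complex tasks needing a vast knowledge base


@dataclass
class AIRequest:
    prompt_tokens: int             # rough proxy for task complexity
    contains_personal_data: bool
    contains_enterprise_data: bool


def route(req: AIRequest, device_token_budget: int = 2_000) -> Tier:
    """Pick the cheapest tier that satisfies privacy and capability needs."""
    # Enterprise-sensitive workloads stay inside the company boundary.
    if req.contains_enterprise_data:
        return Tier.EDGE
    # Personal or small tasks run locally: lowest latency, data never leaves the device.
    if req.contains_personal_data or req.prompt_tokens <= device_token_budget:
        return Tier.DEVICE
    # Everything else falls back to the large cloud model.
    return Tier.CLOUD


if __name__ == "__main__":
    print(route(AIRequest(prompt_tokens=300, contains_personal_data=True,
                          contains_enterprise_data=False)))   # Tier.DEVICE
    print(route(AIRequest(prompt_tokens=12_000, contains_personal_data=False,
                          contains_enterprise_data=False)))   # Tier.CLOUD
```

In practice the routing signal would be far richer (battery state, local model availability, compliance policy), but the economic point the article makes is visible even in this toy version: the cheapest tier that satisfies privacy and capability requirements wins the request, which is what dilutes cloud inference costs.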
In fact, a review of technology history reveals an interesting pattern: technological explosions are often ignited by radical innovators, but the benefits of popularization and commercialization are usually reaped by steady giants. The AI industry is living through its Normandy landing moment. The air raid (large-model training) is over; now it is time for the ground forces (endpoint and edge devices) to advance. At this stage, the market no longer needs more visionaries; it needs executors who can translate AI technology into tangible revenue on financial statements. With its strategic focus on hybrid AI, its globalized supply chain network, and its deep understanding of enterprise customers' pain points, Lenovo Group has proven itself not just a participant in the AI era, but a builder of this new commercial order. Lenovo's value lies not in whether it develops the smartest AI, but in its effort to turn AI into a business: a sustainable, scalable, and profitable business. For investors, in an AI bubble era filled with uncertainty, this kind of certain monetization capability is the scarcest asset.

Yang Yuanqing, Chairman and CEO of Lenovo Group, stated: "Looking ahead, as artificial intelligence integrates deeply into individuals' daily lives and enterprises' operations and development, we will continue to advance hybrid artificial intelligence to seize the opportunities brought by inclusive AI, accelerate growth, further improve profitability, and create long-term, sustainable, tangible returns for shareholders."

Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation to acquire or dispose of any financial product, nor should any associated discussions, comments, or posts by the author or other users. It is provided for general information purposes only and does not take into account your own investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information; investors should do their own research and may seek professional advice before investing.
