CICC Initiates Coverage on KNOWLEDGE ATLAS (02513) with Outperform Rating and HK$688 Target Price

Stock News
Feb 20

CICC has initiated coverage on KNOWLEDGE ATLAS (02513) with an Outperform rating and a target price of HK$688. The report forecasts the company's revenue for 2025-2028 to be RMB 7.2 billion, RMB 17.5 billion, RMB 36.7 billion, and RMB 76.7 billion, respectively, representing a compound annual growth rate (CAGR) of 120%. The firm is optimistic about KNOWLEDGE ATLAS's iterative improvements in its foundational model capabilities and the monetization potential in coding applications.

Valuing the company against leading overseas large language model firms, CICC applied a 2028 price-to-sales (P/S) multiple of 40x with a long-term discount rate of 7% to arrive at the target price. The stock currently trades at a 2028 P/S multiple of 25x, implying a potential upside of 42% from the current share price. KNOWLEDGE ATLAS is recognized as a leading large model developer in China and a world-class player in the field.

Key points from the report are as follows:

**Strong Technical Foundation and Model Capabilities as the Cornerstone** Founded in 2019 out of research achievements at Tsinghua University, the company has focused on developing its GLM series of foundational models, honing core capabilities in areas such as coding, reasoning, and agentic AI. Its latest-generation foundational model, GLM-5, has achieved state-of-the-art (SOTA) results on multiple benchmarks, including HLE and SWE, and has received widespread positive feedback from users both domestically and internationally.

**Commercial Value Realization, Empowering Numerous Industries** The company delivers its model capabilities through its Model-as-a-Service (MaaS) platform, with revenue from API services expected to become the primary growth engine. CICC estimates that by early 2026, API-related annual recurring revenue (ARR) will approach RMB 6 billion, a significant increase over the previous year. The company also serves a wide range of industries, including internet, software, and semiconductors, helping clients unlock model value and enhance productivity.

**AI Coding TAM Potentially Reaching Trillions; KNOWLEDGE ATLAS Holds a Leading Position** As one of the fastest-adopting AI application scenarios, AI coding is still in the early stages of penetration, with a total addressable market (TAM) estimated to reach trillions of yuan. KNOWLEDGE ATLAS has concentrated on refining its capabilities for the coding scenario, with core advantages such as low hallucination rates, high stability, and strong reasoning and tool-use abilities. It is well-positioned to maintain leadership in AI coding and extend its reach into more enterprise scenarios.

**Key Differentiator from Market Consensus** The report highlights a more optimistic view on KNOWLEDGE ATLAS's ability to export its foundational model capabilities and sustain its leading position in the AI coding domain, potentially attracting a broader domestic and international customer base.

**Potential Catalysts** Catalysts include the release of a new generation model and high growth in API and coding-related ARR.

**Risks** Risks involve R&D progress falling short of expectations, slower-than-anticipated commercial expansion, intellectual property-related risks, funding shortages, intensifying competition, and data security risks.

Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation to acquire or dispose of any financial products, and any associated discussions, comments, or posts by the author or other users should not be considered as such either. It is provided for general information purposes only and does not take into account your investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy or completeness of the information; investors should do their own research and may seek professional advice before investing.
