On April 16, a deputy head of the National Bureau of Statistics said at a press conference that China has achieved phased breakthroughs in the commercial, large-scale application of artificial intelligence. As of this March, average daily token calls had surpassed 140 trillion, an increase of more than 40% over the end of 2025.
As the smallest units that large AI models use to process language and images, tokens serve as the core bridge between raw data and AI applications. The volume of token calls directly reflects the frequency, scope, and depth of model deployment, making it a key indicator of the AI industry's vitality.
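For readers who want to see the mechanics, below is a minimal sketch of how text is split into tokens, using OpenAI's open-source tiktoken library purely as an illustration; the library choice and encoding name are assumptions, and each model family ships its own tokenizer.

```python
import tiktoken  # open-source tokenizer library; chosen purely for illustration

# Load a common byte-pair-encoding scheme; each model family defines its own.
enc = tiktoken.get_encoding("cl100k_base")

text = "Tokens are the smallest units a large model processes."
token_ids = enc.encode(text)

# Every model call is metered by counts like this, which is why aggregate
# daily token volume tracks how heavily AI models are being used.
print(f"{len(token_ids)} tokens: {token_ids}")
print(enc.decode(token_ids))  # decoding round-trips back to the original text
```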
From an initial daily average of 100 billion calls at the start of 2024 to 140 trillion today, token usage has surged roughly 1,400-fold in just two years. This explosive growth is driven by the deep integration of AI applications across sectors including education, agriculture, industry, and services.
From an industrial development perspective, the large-scale use of tokens is driving rapid iteration and upgrading of AI models, which in turn is spurring improvements in supporting links such as data governance, labeling, and circulation. It is easing artificial intelligence's transition from general-purpose demonstrations to specialized, sector-level deployments, injecting strong momentum into the digital and intelligent transformation of traditional industries.
Economically, tokens function as measurable, transferable, and priceable data carriers. They effectively address the challenges of fragmented and hard-to-quantify raw data, making it possible to measure and trade the value of data elements. This opens new pathways for the market-based allocation of data resources and vigorously promotes the deep integration of the digital economy with the real economy.
Notably, an implementation plan for advancing the construction of high-quality industry datasets, released by the National Data Administration on April 15, explicitly proposes "exploring new transaction models such as token trading and building a quantifiable and priceable dataset value system based on tokens." This top-level design further confirms the important role of tokens in the market-based allocation of data elements and charts a course for their future development.
More practically, AI applications driven by tokens are continuously optimizing production processes and improving service efficiency. From assisted precision diagnostics in healthcare to intelligent crop management in agriculture to smarter social governance, tokens are tangibly raising the quality of AI-enabled public services and making society run more efficiently and precisely.
Of course, the explosive growth in token calls also imposes higher demands on industrial standards, data governance, and security safeguards. To ensure the sustained release of token value, it is crucial not only to maintain stable volume but also to focus on quality improvement. Specifically, sustained efforts are needed in the following four areas.
First, enhance token quality. Just as high-quality ingredients determine the quality of a dish, high-quality tokens are a prerequisite for reliable AI applications. The National Data Administration's plan, which includes special campaigns such as "data labeling breakthroughs" and "quality and efficiency improvement" focused on building high-quality datasets in key sectors, is a critical measure for addressing the current unevenness in token quality.
Second, rapidly establish smooth data circulation channels. Currently, data fragmentation across departments and industries—often referred to as "data silos"—persists. There is an urgent need to build secure and standardized circulation platforms, improve transfer rules, and promote the compliant flow and efficient allocation of tokens across various fields.
Third, establish a sound token pricing mechanism. The value of tokens should be closely linked to their quality, scarcity, and application scenarios. By accelerating the development of a scientific and fair pricing system, tokens can become truly tradable and value-appreciating assets, fully stimulating market vitality.
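As a purely hypothetical illustration of what such a mechanism might look like, the sketch below prices tokens by scaling a base rate with quality, scarcity, and scenario multipliers; every name and number is invented, since the article calls for a pricing system but does not define one.

```python
def price_per_million_tokens(base_rate: float,
                             quality: float,          # e.g. labeling-accuracy score in [0, 1]
                             scarcity: float,         # e.g. inverse of comparable supply, in [0, 1]
                             scenario_factor: float   # premium for the application scenario
                             ) -> float:
    """Hypothetical pricing rule: a base rate scaled by quality, scarcity,
    and application-scenario multipliers. Illustrative only; the article
    calls for such a mechanism but does not specify one."""
    return base_rate * (1 + quality) * (1 + scarcity) * scenario_factor

# Example: a high-quality, moderately scarce medical dataset priced at a
# 2x scenario premium over an assumed 10-yuan base rate per million tokens.
print(price_per_million_tokens(10.0, quality=0.9, scarcity=0.5, scenario_factor=2.0))
# -> 57.0 yuan per million tokens (all figures invented)
```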
Fourth, balance innovation with security. On one hand, continuous investment in R&D for technologies like token processing and model optimization is needed to enhance application efficiency. On the other hand, it is essential to improve laws, regulations, and supervisory systems to standardize the entire lifecycle of token collection, processing, circulation, and trading, firmly upholding the bottom line of data security and privacy protection.
The explosive growth in token usage poses new challenges for data governance and value extraction. Looking ahead, as the construction of high-quality datasets accelerates and the mechanisms for token circulation and pricing continue to improve, tokens will keep delivering a multiplier effect, driving the deep integration of AI technology with the real economy and helping China advance steadily from a major country in AI application toward a leading power in AI innovation.