NVIDIA CEO Declares Dominance in AI Inference Performance

Deep News
Mar 17

At the GTC 2026 keynote, Jensen Huang showcased a championship belt emblazoned with the verdict of third-party chip analysis firm SemiAnalysis: "InferenceMAX KING." NVIDIA claimed the title for its GB300 NVL72 system, billing it as the most powerful platform for inference.

The data comes from independent benchmarks conducted by SemiAnalysis. Measured in tokens per watt, the GB300 NVL72 delivers 50 times the efficiency of competing systems; measured in cost per token, it is 35 times more economical than rival solutions. On stage, Huang revised NVIDIA's previously announced efficiency figure of 30 times upward, stating, "The actual number is 50 times."

The GB300 NVL72 is the flagship inference configuration built on the Blackwell Ultra architecture. It links 72 GPUs via NVLink 6 for a total system bandwidth of 260 TB/s. Huang emphasized that inference efficiency directly determines the revenue of AI factories, making it the most critical performance metric today. The GB300 series is already shipping, and the next-generation Vera Rubin architecture is expected to enter mass production in 2027.
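A quick back-of-envelope check of the fabric figures quoted above (72 GPUs sharing 260 TB/s of total system bandwidth) gives the average share available to each GPU:

```python
# Sanity-check arithmetic on the quoted NVLink fabric numbers.
# Figures are as stated in the keynote; the per-GPU share is a simple average.
TOTAL_BANDWIDTH_TBS = 260  # total NVLink system bandwidth, TB/s
NUM_GPUS = 72              # GPUs in one GB300 NVL72 rack

per_gpu_tbs = TOTAL_BANDWIDTH_TBS / NUM_GPUS
print(f"~{per_gpu_tbs:.1f} TB/s of fabric bandwidth per GPU")  # ≈ 3.6 TB/s
```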

Disclaimer: Investing carries risk. This is not financial advice. The content above should not be regarded as an offer, recommendation, or solicitation to acquire or dispose of any financial products, nor should any associated discussions, comments, or posts by the author or other users. It is provided for general information purposes only and does not take into account your investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information; investors should do their own research and may wish to seek professional advice before investing.
