DeepSeek Rolls Out New Model Capable of Processing Extremely Long Texts in One Go

Deep News
2 hours ago

On February 11th, multiple users reported that DeepSeek has rolled out an update on both its web and app platforms, now supporting context lengths of up to 1 million tokens. This is a significant jump from the 128K context length of DeepSeek V3.1, released last August.

In hands-on tests, DeepSeek confirmed in conversation that it supports a 1-million-token context, allowing it to process extremely long texts in a single session. When given the full text of the novel "Jane Eyre," a document exceeding 240,000 tokens, DeepSeek successfully recognized and processed the content.
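As a back-of-the-envelope illustration of what a 1-million-token window means, the sketch below estimates whether a document fits in the context. The ~4-characters-per-token ratio is a rough heuristic for English text, not DeepSeek's actual tokenizer, and the helper names are hypothetical:

```python
# Rough check of whether a document fits in a model's context window.
# ASSUMPTION: ~4 characters per token, a common heuristic for English
# text; the true count depends on the model's own tokenizer.
CHARS_PER_TOKEN = 4

def estimated_tokens(text: str) -> int:
    """Return a rough token estimate for `text`."""
    return len(text) // CHARS_PER_TOKEN

def fits_in_context(text: str, context_tokens: int = 1_000_000) -> bool:
    """True if the rough estimate fits within the given context length."""
    return estimated_tokens(text) <= context_tokens

# A novel-length plain-text file of about one million characters
# estimates to ~250K tokens under this heuristic -- the same ballpark
# as the ~240K tokens reported for "Jane Eyre" above.
sample = "a" * 1_000_000
print(estimated_tokens(sample))   # 250000
print(fits_in_context(sample))    # True
```

Under this heuristic, even a full novel uses only about a quarter of the new window, while it would overflow the previous 128K limit several times over.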

Previously, informed sources suggested that DeepSeek was more likely to release minor updates to its V3 series models during the Spring Festival period. The same sources indicated, however, that the main event is still to come: DeepSeek's next-generation flagship is expected to be a foundation model with trillions of parameters. Because of this substantial jump in scale, training has noticeably slowed, delaying the release timeline.

Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation to acquire or dispose of any financial products, and any associated discussions, comments, or posts by the author or other users should not be considered as such either. It is provided for general informational purposes only and does not take into account your investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information; investors should do their own research and may seek professional advice before investing.
