DeepSeek-V3 Unveiled: 671B Parameters and 128K Context for Free
DeepSeek-V3
September 1, 2025 · Author: Łukasz Grochal

DeepSeek-V3 represents a significant evolutionary leap over its predecessor, DeepSeek-V2. The most notable upgrade is the expansion in scale: the Mixture-of-Experts architecture grows to 671 billion total parameters, of which roughly 37 billion are activated per token, up from DeepSeek-V2's 236 billion total. This enhanced architecture translates into superior performance across a wide array of benchmarks, including coding, mathematics, and general reasoning tasks. The model also supports a context window of up to 128,000 tokens.
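To make the 128,000-token figure concrete, here is a minimal sketch of a pre-flight check that estimates whether a long document will fit in the context window before it is sent to the model. It uses tiktoken's cl100k_base encoding purely as a rough proxy (DeepSeek ships its own tokenizer, so real counts will differ), and the reply budget is an arbitrary assumption for illustration.

```python
# Rough check of whether a document fits in a 128K-token context window.
# tiktoken's cl100k_base is only an approximation of DeepSeek's tokenizer.
import tiktoken

CONTEXT_WINDOW = 128_000   # advertised context length, in tokens
RESPONSE_BUDGET = 4_000    # tokens reserved for the model's reply (assumption)

def fits_in_context(document: str) -> bool:
    """Return True if the document plus a reply budget fits in the window."""
    encoding = tiktoken.get_encoding("cl100k_base")
    token_count = len(encoding.encode(document))
    return token_count + RESPONSE_BUDGET <= CONTEXT_WINDOW

long_report = "Lorem ipsum dolor sit amet. " * 20_000  # stand-in for a long document
print(fits_in_context(long_report))
```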

This allows the model to process and comprehend extremely long documents and complex technical papers, and to maintain context over extended conversations, far more effectively than previous versions. Furthermore, while remaining focused on strong text-based capabilities, the V3 iteration is positioned as a foundation for future multimodal functionality.
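As an illustration of feeding a long document to the model in a single request, the sketch below assumes the OpenAI-compatible chat completions endpoint that DeepSeek documents; the base URL, environment variable, and model name ("deepseek-chat") are taken as assumptions and should be checked against the provider's current documentation.

```python
# A hedged sketch of summarizing a long technical paper in one request,
# assuming DeepSeek's OpenAI-compatible chat endpoint. The base_url and
# model name below are assumptions and may change over time.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],  # assumed environment variable name
    base_url="https://api.deepseek.com",     # assumed OpenAI-compatible endpoint
)

# Placeholder for a long document that fits inside the 128K-token window.
paper = "(full text of a long technical paper goes here)"

response = client.chat.completions.create(
    model="deepseek-chat",  # assumed model identifier
    messages=[
        {"role": "system", "content": "You are a careful technical summarizer."},
        {"role": "user", "content": f"Summarize the key findings:\n\n{paper}"},
    ],
)
print(response.choices[0].message.content)
```

Because the entire paper travels in a single user message, no chunking or retrieval pipeline is needed as long as the document stays within the context window.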

Perhaps the most impactful change is the accessibility strategy: unlike many competitors, this more powerful model is available to the public free of charge, with openly released weights, marking a bold move to democratize access to cutting-edge AI.