A new open-source AI that's shaking up the giants
On May 28, 2025, DeepSeek, an emerging AI player in China, released DeepSeek R1-0528, a powerful update to its open-source reasoning model, which aims to compete with top-performing proprietary models such as GPT-4.5 (OpenAI) and Gemini 2.5 Pro (Google). Built on a Mixture-of-Experts (MoE) architecture, the model has roughly 671 billion parameters, of which routing activates about 37 billion per token, giving it the capacity of a very large model while keeping inference costs under control [1].
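To make the routing idea concrete, here is a minimal, illustrative top-k MoE layer in PyTorch. The dimensions, expert count, and top-k value are arbitrary toy choices, not DeepSeek's actual configuration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    """Illustrative top-k Mixture-of-Experts layer.

    Dimensions, expert count, and top_k are toy values,
    not DeepSeek's actual configuration.
    """

    def __init__(self, d_model: int = 512, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # gating network
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model),
                nn.GELU(),
                nn.Linear(4 * d_model, d_model),
            )
            for _ in range(n_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (n_tokens, d_model). Pick the top-k experts per token.
        gate_probs = F.softmax(self.router(x), dim=-1)
        weights, idx = torch.topk(gate_probs, self.top_k, dim=-1)
        weights = weights / weights.sum(dim=-1, keepdim=True)  # renormalize

        out = torch.zeros_like(x)
        # Only the selected experts run for each token, so per-token
        # compute scales with top_k, not with the total expert count.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

tokens = torch.randn(16, 512)       # 16 token embeddings
print(ToyMoELayer()(tokens).shape)  # torch.Size([16, 512])
```

The essential property is that each token activates only `top_k` experts, which is how a model can carry hundreds of billions of parameters while spending only a fraction of them per token.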
The model builds on a base pre-trained on a massive corpus of roughly 14.8 trillion tokens, including multilingual, scientific, and technical data. DeepSeek is clearly betting on a high-performance, open-source foundation model that can be adapted for vertical, industrial, or educational applications.
Technical performance: promising results
DeepSeek R1-0528 stands out on several public benchmarks, including:
- MMLU (Massive Multitask Language Understanding): 78.1% (dev set), comparable to GPT-4.
- GSM8K (grade-school math word problems): 91.0%, outperforming Claude 3 and Gemini 1.5.
- HumanEval (Python code generation): 88.7%, outperforming Mistral Large [2].
These results show a level of technical maturity rarely associated with open-source models, which are often perceived as lagging behind proprietary solutions. By leveraging the MoE architecture, DeepSeek balances adaptability, scalability, and inference speed.
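For context on how such scores are typically produced, the sketch below shows exact-match scoring of GSM8K-style answers. `ask_model` is a hypothetical stand-in for a real inference call, and the sample item is illustrative, not drawn from the benchmark:

```python
import re

def extract_final_number(text: str) -> str | None:
    """Pull the last number out of a model's free-form answer."""
    matches = re.findall(r"-?\d+(?:\.\d+)?", text.replace(",", ""))
    return matches[-1] if matches else None

def ask_model(question: str) -> str:
    # Hypothetical stand-in for a real inference call (e.g. an endpoint
    # serving DeepSeek R1-0528); it returns a canned answer so the
    # sketch runs on its own.
    return "6 times 7 gives 42."

def exact_match_accuracy(items: list[dict]) -> float:
    """Share of items whose extracted final answer matches the reference."""
    correct = sum(
        extract_final_number(ask_model(item["question"])) == item["answer"]
        for item in items
    )
    return correct / len(items)

sample = [{"question": "What is 6 times 7?", "answer": "42"}]
print(exact_match_accuracy(sample))  # 1.0
```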
Identified use cases for open-source AI
The flexibility of the DeepSeek R1-0528 opens up new possibilities in several sectors:
- Education: creation of educational content in Chinese, English, and French; worked solutions to technical and math problems.
- Finance: report analysis and automated generation of industry summaries.
- Industry: support for predictive maintenance through integration with IoT systems.
- R&D: AI-assisted research across the scientific literature.
Startups in Asia and Europe are already testing the model, whose weights are freely downloadable from platforms such as Hugging Face and can be deployed with inference engines such as vLLM.
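One common workflow, shown schematically below, is to serve the published weights with vLLM's OpenAI-compatible server and query them with a standard client. The model ID, port, and prompt are assumptions to adapt to your own deployment:

```python
# Assumed setup: the weights are being served locally with vLLM's
# OpenAI-compatible server, started with something like:
#   vllm serve deepseek-ai/DeepSeek-R1-0528
# The model ID, port, and prompt are placeholders to adapt.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="unused")

response = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1-0528",
    messages=[{
        "role": "user",
        "content": "Summarize the key risks mentioned in this quarterly report: ...",
    }],
    temperature=0.6,
)
print(response.choices[0].message.content)
```

Because the server speaks the OpenAI wire format, existing tooling works largely unchanged, which is part of what makes open weights attractive for the use cases above.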
Open source as a driver of technological sovereignty
The open-source release of DeepSeek R1-0528 is part of a broader strategy of technological self-reliance. At a time when leading AI models are concentrated in the hands of a few major U.S. companies, initiatives of this kind strengthen the ability of public- and private-sector actors to build sovereign, interoperable systems.
Like Mistral AI and the Falcon LLM project, DeepSeek contributes to an ecosystem in which open standards and transparent models are set to play a central role in the democratization and regulation of AI.
Toward Responsible, High-Performance, and Collaborative AI
DeepSeek reaffirms its commitment to responsible AI:
- the R1-0528 model is open source (source code and model weights are available),
- safety filters are built in to curb abusive or harmful generations (a schematic application-side analogue follows this list),
- community oversight accompanies the release via GitHub and Hugging Face.
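DeepSeek's built-in safeguards live inside the model and its serving stack; as a schematic analogue only, here is a minimal application-side output filter with an entirely hypothetical blocklist (real deployments rely on trained moderation models, not keyword matching):

```python
import re

# Entirely hypothetical blocklist, for illustration only; real
# deployments use trained moderation models, not keyword matching.
BLOCKED_PATTERNS = [r"\bcard number\b", r"\bbuild a weapon\b"]

def passes_output_filter(text: str) -> bool:
    """Return False if the generated text matches any blocked pattern."""
    return not any(re.search(p, text, re.IGNORECASE) for p in BLOCKED_PATTERNS)

def safe_generate(generate, prompt: str) -> str:
    """Wrap a generation callable with a post-hoc safety check."""
    text = generate(prompt)
    return text if passes_output_filter(text) else "[withheld by safety filter]"

# Dummy generator so the sketch runs standalone:
print(safe_generate(lambda p: "Paris is the capital of France.", "Capital of France?"))
```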
Rather than pitting innovation against regulation, DeepSeek advocates for open cooperation, in which technological performance operates within a transparent, well-documented framework that complies with the growing requirements of the European AI Act [3].
Healthy competition that drives the global AI ecosystem
The arrival of DeepSeek R1-0528 reflects a broader trend: the rise of high-performance open-source models. Far from undermining the industry giants, these alternatives foster innovation, healthy competition, and the acceleration of industry standards.
As Meta advances its Llama series and Mistral builds out multilingual MoE models such as Mixtral, open source is emerging as a credible and strategic path for companies seeking to integrate generative AI into their processes while maintaining control and transparency.
References
[1] DeepSeek. (2025). DeepSeek-V2 GitHub repository. https://github.com/deepseek-ai/DeepSeek-V2
[2] Hugging Face. (2025). Open LLM Leaderboard. https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard
[3] European Commission. (2024). Regulatory framework for AI. https://digital-strategy.ec.europa.eu/en/policies/regulatory-framework-ai

