DeepSeek
DeepSeek was founded in 2023 to research world-leading foundation models and techniques for artificial general intelligence (AGI), tackling cutting-edge challenges in AI. Leveraging self-developed training frameworks, proprietary AI computing clusters, and tens of thousands of compute units, the DeepSeek team released and open-sourced several billion-parameter large models within half a year, including the general-purpose large language model DeepSeek-LLM and the code-focused DeepSeek-Coder. In January 2024, DeepSeek open-sourced DeepSeek-MoE, China's first open-source Mixture of Experts (MoE) large model. These models have performed strongly, outperforming models of comparable size both on public benchmarks and on real-world generalization tasks beyond their training data. You can chat with DeepSeek's models directly or access them through its API.
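As a quick illustration of API access, below is a minimal sketch of querying a DeepSeek chat model through an OpenAI-compatible client. The base URL, the model name "deepseek-chat", and the DEEPSEEK_API_KEY environment variable are assumptions based on common usage, not guarantees of this text; consult the official API documentation for current values.

# Minimal sketch: calling a DeepSeek chat model via the OpenAI-compatible
# Python client. Base URL, model name, and env var are assumptions; verify
# against the official API docs before use.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],  # assumed env var holding your key
    base_url="https://api.deepseek.com",     # assumed OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-chat",  # assumed model identifier
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Briefly explain what a Mixture of Experts model is."},
    ],
)

# Print the assistant's reply from the first completion choice.
print(response.choices[0].message.content)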