Hugging Face open-sources top small-parameter model SmolLM3
Jin10 Data, July 9th: early this morning, Hugging Face, the globally renowned open-source large-model platform, open-sourced SmolLM3, a top-tier small-parameter model. SmolLM3 has only 3 billion parameters, yet its performance significantly surpasses comparable open-source models such as Llama-3.2-3B and Qwen2.5-3B. It offers a 128k context window and supports six languages, including English, French, Spanish, and German. It provides both deep-thinking and non-thinking reasoning modes, which users can switch between flexibly.
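For readers curious how such mode switching typically works in practice, here is a minimal, hedged sketch. It assumes the model follows the common convention of toggling reasoning via a system-prompt flag (e.g. `/think` vs `/no_think`); the exact flag names and mechanism for SmolLM3 should be confirmed against its model card on Hugging Face.

```python
# Hedged sketch: switching between deep-thinking and non-thinking modes
# by placing a flag in the system prompt. The "/think" and "/no_think"
# flags are an assumption based on common chat-template conventions,
# not a confirmed SmolLM3 API -- check the official model card.
def build_messages(question: str, thinking: bool) -> list[dict]:
    """Build a chat message list that requests (or suppresses)
    the model's extended-reasoning mode via a system-prompt flag."""
    flag = "/think" if thinking else "/no_think"
    return [
        {"role": "system", "content": flag},
        {"role": "user", "content": question},
    ]

# Example: a quick factual question where deep thinking is unnecessary.
messages = build_messages("What is the capital of France?", thinking=False)
print(messages[0]["content"])  # → /no_think
```

The resulting `messages` list would then be passed to a chat template (e.g. `tokenizer.apply_chat_template` in the `transformers` library) before generation; the flag simply tells the model which reasoning behavior to use for that conversation.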