Google Open-Sources Gemma-3: Performance Comparable to DeepSeek, Compute Requirements Plummet


Jinshi Data news, March 13th: Last night, Google (GOOG.O) CEO Sundar Pichai announced the open-sourcing of Gemma-3, the company's latest multimodal large model, which features low cost and high performance. Gemma-3 comes in four parameter sizes: 1 billion, 4 billion, 12 billion, and 27 billion. Even the largest 27-billion-parameter version needs only a single H100 GPU for efficient inference, making it at least 10 times more compute-efficient than comparable models and currently the most powerful small-parameter model. According to blind-test data from LMSYS Chatbot Arena, Gemma-3 ranks second only to DeepSeek's R1-671B, above well-known models such as OpenAI's o3-mini and Llama3-405B.
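The single-GPU claim can be tried with a short inference script. Below is a minimal sketch using the Hugging Face transformers library; the model id google/gemma-3-27b-it, the bfloat16 precision, and the memory figures in the comments are assumptions not stated in the article, and actual requirements vary with quantization and context length.

```python
# Minimal sketch of single-GPU inference with Gemma-3 27B, per the article's
# claim that one H100 suffices. Assumptions not from the article: the Hugging
# Face model id "google/gemma-3-27b-it" and bfloat16 weights (roughly 54 GB,
# which fits within an H100's 80 GB of memory).
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="google/gemma-3-27b-it",  # assumed model id; repository is gated
    torch_dtype=torch.bfloat16,     # half precision so weights fit on one GPU
    device_map="auto",              # place model layers on the available GPU
)

result = generator(
    "Summarize why small open-weight models matter.",
    max_new_tokens=128,
)
print(result[0]["generated_text"])
```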
