Google DeepMind announced today the release of the 2 billion (2B) parameter version of Gemma 2, the second generation of its Gemma AI models. First launched in February 2024, Gemma is a family of lightweight, text-to-text open models designed for developers and researchers — and built on the technology that powers Google Gemini. DeepMind released Gemma 2 in June in two sizes: 9 billion (9B) and 27 billion (27B) parameters. The new 2B model learns from larger models through distillation and produces outsized results, DeepMind says. The company also claims that it outperforms all GPT-3.5 models on the…
This story continues at The Next Web
https://thenextweb.com/news/google-deepmind-2b-parameter-gemma-2-model