Google DeepMind’s family of open AI models, Gemma, has passed a significant milestone, crossing 150 million downloads since its debut. Omar Sanseviero, a developer relations engineer at DeepMind, shared the figures over the weekend, noting that the developer community has already published more than 70,000 custom Gemma variants on the Hugging Face platform.
Originally introduced in February 2024, Google’s Gemma was designed to compete head-to-head with other prominent open-model families such as Meta’s Llama. Recent updates have expanded Gemma into a multimodal system capable of handling both text and images, and it now supports upwards of 100 different languages. Google has also refined specialized Gemma versions specifically tailored for unique tasks such as drug discovery.
Despite that rapid growth, Gemma still trails its primary competitor by a wide margin: Meta’s Llama surpassed 1.2 billion downloads in late April.
Like Llama, Gemma has drawn criticism from the developer community over its licensing. Some developers argue that its custom, non-standard license terms make commercial use legally risky or discourage it outright.