Google's Gemma AI Models Reach 150 Million Downloads
Google's open Gemma AI models have reached a significant milestone, surpassing 150 million downloads. Omar Sanseviero, a developer relations engineer at Google DeepMind, announced the figure on X (formerly Twitter). He also noted that developers have created more than 70,000 variants of Gemma on Hugging Face, the popular AI development platform.
Gemma just passed 150 million downloads and over 70k variants on Hugging Face. What would you like to see in the next Gemma versions?
— Omar Sanseviero (@osanseviero)
Launched in February 2024, Gemma is positioned to compete with other open model families such as Meta's Llama. The latest Gemma releases are multimodal, able to process both images and text, and support more than 100 languages. Google has also developed specialized Gemma variants fine-tuned for specific domains, including drug discovery.
Gemma vs. Llama: The Download Race
While 150 million downloads is a substantial achievement, Gemma still trails its main rival: Meta's Llama surpassed 1.2 billion downloads in April 2025.
Both Gemma and Llama have drawn criticism for their custom, non-standard licensing terms, which some developers argue make commercial use a risky proposition.
Despite those licensing concerns, Gemma's download growth reflects its rising popularity within the AI community. How Gemma evolves, and how it fares against Llama, will be worth watching.