Google's Gemma AI Models Hit 150M Downloads

Google's family of openly available AI models, Gemma, has passed a significant milestone: more than 150 million downloads since its introduction. The figure underscores the models' growing adoption among developers.

The announcement came over the weekend in a post on X from Omar Sanseviero, a developer relations engineer at Google DeepMind. Sanseviero also noted that the developer community has been actively building on Gemma, creating more than 70,000 variants of the models on the AI development platform Hugging Face.

Gemma's Role in the Open Model Landscape

Launched in February 2024, Gemma was Google's strategic move into the openly available AI model space, aiming to compete with established families like Meta's Llama. The Gemma models are designed to be versatile: recent releases are multimodal, able to process both text and images, and support more than 100 languages. Google has also developed specialized versions of Gemma fine-tuned for specific tasks, such as drug discovery research.

Market Context and Community Considerations

While reaching 150 million downloads in a little over a year is a notable accomplishment, Gemma still trails its primary competitor, Meta's Llama, which reported crossing the 1.2 billion download mark in late April 2025.

Both Gemma and Llama, despite being described as "open," have drawn scrutiny from some developers over their custom, non-standard licensing terms, which critics argue could pose risks for commercial applications.
