Exponential growth brews 1 million AI models on Hugging Face



On Thursday, AI hosting platform Hugging Face surpassed 1 million AI model listings for the first time, marking a milestone in the rapidly expanding field of machine learning. An AI model is a computer program (often using a neural network) trained on data to perform specific tasks or make predictions. The platform, which started as a chatbot app in 2016 before pivoting to become an open source hub for AI models in 2020, now hosts a wide array of tools for developers and researchers.

The machine-learning field represents a far bigger world than just large language models (LLMs) like the kind that power ChatGPT. In a post on X, Hugging Face CEO Clément Delangue wrote about how his company hosts many high-profile AI models, like "Llama, Gemma, Phi, Flux, Mistral, Starcoder, Qwen, Stable Diffusion, Grok, Whisper, Olmo, Command, Zephyr, OpenELM, Jamba, Yi," but also "999,984 others."

The reason why, Delangue says, comes down to customization. "Contrary to the '1 model to rule them all' fallacy," he wrote, "smaller specialized customized optimized models for your use-case, your domain, your language, your hardware and generally your constraints are better. As a matter of fact, something that few people realize is that there are almost as many models on Hugging Face that are private only to one organization – for companies to build AI privately, specifically for their use-cases."

A Hugging Face-supplied chart showing the number of AI models added to Hugging Face over time, month to month.

Hugging Face's transformation into a major AI platform follows the accelerating pace of AI research and development across the tech industry. In just a few years, the number of models hosted on the site has grown dramatically along with interest in the field. On X, Hugging Face product engineer Caleb Fahlgren posted a chart of models created each month on the platform (and a link to other charts), saying, "Models are going exponential month over month and September isn't even over yet."

The power of fine-tuning

As hinted by Delangue above, the sheer number of models on the platform stems from its collaborative nature and the practice of fine-tuning existing models for specific tasks. Fine-tuning means taking an existing model and giving it additional training to add new concepts to its neural network and alter how it produces outputs. Developers and researchers from around the world contribute their results, leading to a large ecosystem.
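The fine-tuning idea can be illustrated in miniature. The sketch below is a toy example in plain PyTorch, not any real Hugging Face model or pipeline: it stands in a small random network for a "pretrained" model, freezes the base features, and continues training only the task head on new (synthetic) data.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for a pretrained model: a feature layer plus a task head.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# A common fine-tuning recipe: freeze the base features,
# train only the head on task-specific data.
for p in model[0].parameters():
    p.requires_grad = False
frozen_before = model[0].weight.detach().clone()

x = torch.randn(32, 4)          # toy inputs for the new task
y = torch.randint(0, 2, (32,))  # toy labels for the new task

opt = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-2
)
loss_fn = nn.CrossEntropyLoss()

initial_loss = loss_fn(model(x), y).item()
for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
final_loss = loss_fn(model(x), y).item()
```

In a real workflow the frozen layers would be the billions of pretrained weights of a base model like Llama, and only a small head (or low-rank adapter) would be updated, which is why so many cheap task-specific variants can be published.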

For example, the platform hosts many variations of Meta's open-weights Llama models that represent different fine-tuned versions of the original base models, each optimized for specific applications.

Hugging Face's repository includes models for a wide range of tasks. Browsing its models page shows categories such as image-to-text, visual question answering, and document question answering under the "Multimodal" section. In the "Computer Vision" category, there are sub-categories for depth estimation, object detection, and image generation, among others. Natural language processing tasks like text classification and question answering are also represented, along with audio, tabular, and reinforcement learning (RL) models.

A screenshot of the Hugging Face models page captured on September 26, 2024. Credit: Hugging Face

When sorted by "most downloads," the Hugging Face models list reveals trends about which AI models people find most useful. At the top, with a massive lead at 163 million downloads, is Audio Spectrogram Transformer from MIT, which classifies audio content like speech, music, and environmental sounds. Following that, with 54.2 million downloads, is BERT from Google, an AI language model that learns to understand English by predicting masked words and sentence relationships, enabling it to assist with various language tasks.

Rounding out the top five AI models are all-MiniLM-L6-v2 (which maps sentences and paragraphs to 384-dimensional dense vector representations, useful for semantic search), Vision Transformer (which processes images as sequences of patches to perform image classification), and OpenAI's CLIP (which connects images and text, allowing it to classify or describe visual content using natural language).
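The "useful for semantic search" part works because once text is mapped to dense vectors, search reduces to nearest-neighbor lookup by cosine similarity. The sketch below uses random 384-dimensional stand-in vectors rather than real model output; an actual pipeline would obtain the embeddings from a model such as all-MiniLM-L6-v2.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 384-dim embeddings standing in for real model output.
docs = rng.standard_normal((5, 384))               # 5 "document" vectors
query = docs[2] + 0.01 * rng.standard_normal(384)  # query close to doc 2

def semantic_search(query, docs):
    """Rank documents by cosine similarity to the query vector."""
    q = query / np.linalg.norm(query)
    d = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    scores = d @ q           # cosine similarity of each doc to the query
    order = np.argsort(-scores)  # best match first
    return order, scores

order, scores = semantic_search(query, docs)
```

Because the query vector was built near document 2, that document comes back as the top hit; with real embeddings, semantically similar sentences cluster the same way.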

No matter the model or the task, the platform just keeps growing. "Today a new repository (model, dataset or space) is created every 10 seconds on HF," wrote Delangue. "Ultimately, there's going to be as many models as code repositories and we'll be here for it!"


