
Is Generative AI Having Its iPhone Moment?

by admin

Small language models continue to gain traction among enterprises adopting generative AI for bespoke use cases.

Models that provide the cognitive horsepower behind generative AI applications come in all shapes and sizes. From large language models (LLMs) with billions of parameters to small language models (SLMs) a fraction of that size, there is a model for virtually every use case.

Yet despite all the bells and whistles that accompany frontier models—large-scale systems on the bleeding edge of AI’s cognitive boundaries—it’s the SLMs that are proliferating across enterprises. Look no further than the announcement from AI hosting platform HuggingFace that its portal recently eclipsed 1 million model listings.

The news is significant for a few reasons. In addition to showcasing the breadth of choices customers have at their disposal for models of varying sizes and capabilities, it underscores GenAI’s momentum at a time when some are questioning the value and ROI the technology brings to businesses.

Yet it also demonstrates the rising popularity of smaller, custom models among businesses pursuing targeted GenAI use cases. As HuggingFace CEO Clément Delangue stated on X:

“Because contrary to the ‘1 model to rule them all’ fallacy, smaller specialized customized optimized models for your use-case, your domain, your language, your hardware and generally your constraints are better. As a matter of fact, something that few people realize is that there are almost as many models on Hugging Face that are private only to one organization – for companies to build AI privately, specifically for their use-cases.”

Why does this matter? Smaller, customized models provide customers the freedom of choice to pursue the use cases they want, while enabling them to keep their data and IP close and curb costs.

Smaller models also afford organizations more flexibility in running GenAI applications in their own datacenters or even extending to the edge of their network—on AI PCs and smartphones.

There’s an AI For That

If this news evokes a feeling of déjà vu, it might be because it recalls the milestones Apple touted for its App Store. The App Store surpassed 1 million available apps in 2013, a critical mass that demonstrated how pervasive the iPhone was becoming among consumers worldwide.

HuggingFace’s crossing of the 1-million-models threshold suggests that GenAI is having its iPhone moment. Reasonable minds may debate whether the analogy between mobile apps and models is fair and accurate. But they’d be missing the macro point.

More interesting is that it’s largely the smaller models, not the frontier models, that are proliferating across the enterprise. It suggests that organizations are leveraging a broad mix of models as they pursue private use cases, such as building digital assistants, creating sales and marketing collateral or automating code creation.

Multimodal Models Point to a Bright Future

Emerging models provide powerful options. Exhibit A is Meta’s Llama 3.2, a suite of open multimodal models that can process text and images. Llama 3.2 includes small and medium-sized vision models (11 billion and 90 billion parameters) that can interpret charts and graphs and caption images, as well as lightweight 1B and 3B text-only models that can run in edge and mobile environments, including on PCs.
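To give a sense of how approachable these smaller models are, here is a minimal sketch of running one locally with the Hugging Face transformers library. It assumes the gated meta-llama/Llama-3.2-1B-Instruct listing on the Hugging Face Hub (access requires accepting Meta’s license); any comparably small text model would work the same way.

```python
# Minimal sketch: generating text locally with a small model via the
# Hugging Face transformers pipeline API. The model ID below is the gated
# meta-llama/Llama-3.2-1B-Instruct listing and is used purely as an
# illustration; substitute any small model you have access to.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.2-1B-Instruct",
    device_map="auto",  # needs the accelerate package; remove to load on CPU
)

prompt = "Draft a two-sentence summary of why small language models suit edge deployments."
result = generator(prompt, max_new_tokens=96)

# The pipeline returns a list of generations; print the first one.
print(result[0]["generated_text"])
```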

When combined with retrieval-augmented generation (RAG), the Llama 3.2 models give organizations a powerful way to ground and refine model results as they pursue their GenAI initiatives from the comfort of their own datacenters. Today most organizations prefer to run their AI models on-premises, with the capability to burst to the edge, because it allows them to safeguard sensitive data and IP, meet compliance mandates and control costs.
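To make the RAG pattern concrete, the sketch below pairs that same small model with a naive retrieval step. The in-memory document list, the all-MiniLM-L6-v2 embedding model from the sentence-transformers library and the prompt wording are illustrative assumptions, not a reference architecture; a real deployment would retrieve from the organization’s own document store or vector database.

```python
# Minimal retrieval-augmented generation (RAG) sketch: retrieve the most
# relevant internal document, then ask a small model to answer using only
# that context. Documents, model choices and prompt wording are illustrative.
from sentence_transformers import SentenceTransformer, util
from transformers import pipeline

# 1. Embed a handful of internal documents (here, plain strings in memory).
documents = [
    "Q3 revenue grew 12% year over year, driven by edge server demand.",
    "The travel policy requires director approval for international trips.",
    "Support tickets are triaged within four business hours.",
]
embedder = SentenceTransformer("all-MiniLM-L6-v2")
doc_embeddings = embedder.encode(documents, convert_to_tensor=True)

# 2. Retrieve the document closest to the user's question by cosine similarity.
question = "How quickly are support tickets handled?"
query_embedding = embedder.encode(question, convert_to_tensor=True)
best_match = util.cos_sim(query_embedding, doc_embeddings).argmax().item()
context = documents[best_match]

# 3. Ground the small model's answer in the retrieved context.
generator = pipeline("text-generation", model="meta-llama/Llama-3.2-1B-Instruct")
prompt = (
    f"Answer the question using only this context:\n{context}\n\n"
    f"Question: {question}\nAnswer:"
)
result = generator(prompt, max_new_tokens=64)
print(result[0]["generated_text"])
```

The design point is that the retrieved context, rather than anything baked into the model’s weights, carries the organization-specific knowledge, which is why even a 1B or 3B model can return grounded answers about private data.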

However exciting HuggingFace and Meta’s GenAI developments may be for the AI industry at large, neither model size nor capabilities will make or break your GenAI initiatives.

Harnessing value from GenAI requires organizational changes to the way we all work, from daily practices to entire processes and workflows. This is especially important as the business world moves closer to adopting agents, the GenAI-fueled digital assistants that work autonomously to achieve goals.

Managing change will be paramount throughout, requiring close collaboration between IT, HR and other lines of business to shepherd organizations through their GenAI journeys. Fortunately, organizations needn’t navigate the journey alone.

Dell Technologies works closely with both Meta and HuggingFace, vital cogs in the vibrant ecosystem of vendors striving to meet customers’ GenAI needs.

Learn more about the Dell AI Factory.
