Monday, March 10, 2025

Snowflake introduces new enterprise-grade LLM

Author: Simon Rowley

Snowflake, a data cloud company, has announced the launch of Snowflake Arctic, a large language model (LLM) designed to be the most open, enterprise-grade LLM on the market.

With its unique Mixture-of-Experts (MoE) architecture, Arctic delivers top-tier intelligence with unparalleled efficiency at scale. It is optimised for complex enterprise workloads, topping several industry benchmarks across SQL code generation, instruction following, and more. In addition, Snowflake is releasing Arctic’s weights under an Apache 2.0 license, along with details of the research behind how it was trained, setting a new openness standard for enterprise AI technology. The Snowflake Arctic LLM is part of the Snowflake Arctic model family, a family of models built by Snowflake that also includes practical text-embedding models for retrieval use cases.

“This is a watershed moment for Snowflake, with our AI research team innovating at the forefront of AI,” says Sridhar Ramaswamy, CEO, Snowflake. “By delivering industry-leading intelligence and efficiency in a truly open way to the AI community, we are furthering the frontiers of what open source AI can do. Our research with Arctic will significantly enhance our capability to deliver reliable, efficient AI to our customers.”

According to a recent report by Forrester, approximately 46% of global enterprise AI decision-makers noted that they are leveraging existing open source LLMs to adopt generative AI as part of their organisation’s AI strategy. As the data foundation for more than 9,400 companies and organisations around the world, Snowflake is empowering all users to leverage their data with open LLMs, while offering them flexibility and choice over which models they work with.

Now with the launch of Arctic, Snowflake is delivering a powerful, truly open model with an Apache 2.0 license that permits ungated personal, research, and commercial use. Taking it one step further, Snowflake also provides code templates, alongside flexible inference and training options, so users can quickly get started with deploying and customising Arctic using their preferred frameworks, including NVIDIA NIM with NVIDIA TensorRT-LLM, vLLM, and Hugging Face. For immediate use, Arctic is available for serverless inference in Snowflake Cortex, Snowflake’s fully managed service that offers machine learning and AI solutions in the data cloud.
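To illustrate what self-hosted inference might look like, the sketch below builds a request for vLLM’s OpenAI-compatible HTTP server, one of the frameworks named above. The model identifier and prompt are assumptions for illustration; the endpoint path is vLLM’s standard chat-completions route, and an actual call requires a running `vllm serve` instance with the Arctic weights downloaded.

```python
# Hedged sketch: composing a chat-completion request for an Arctic model
# served locally via vLLM's OpenAI-compatible API. The model id below is
# an assumed Hugging Face identifier; adjust it to your deployment.
import json

payload = {
    "model": "Snowflake/snowflake-arctic-instruct",  # assumed model id
    "messages": [
        {"role": "user",
         "content": "Generate a SQL query that lists the top 10 customers by revenue."}
    ],
    "max_tokens": 256,
    "temperature": 0.2,
}

# Serialize the request body; in practice this would be POSTed to
# http://localhost:8000/v1/chat/completions on the vLLM server.
body = json.dumps(payload)
print(body)
```

Because the server speaks the OpenAI wire format, existing OpenAI client libraries can also be pointed at the local endpoint instead of hand-building the JSON body.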

It will also be available on Amazon Web Services (AWS), alongside other model gardens and catalogues, which will include Hugging Face, Lamini, Microsoft Azure, NVIDIA API catalogue, Perplexity, Together AI, and more.


