In the rapidly evolving world of enterprise artificial intelligence, a recent disclosure has underscored a persistent trend: despite the hype surrounding cost-effective open-source alternatives, big businesses continue to favor established large language models from tech giants. According to a report from The Information, Snowflake Inc., the cloud data platform, has quietly generated significant revenue from its AI offerings, with the bulk tied to integrations with premium models from OpenAI, Google, and Anthropic. This insight, drawn from internal figures not publicly detailed in Snowflake’s earnings calls, reveals that these “brand-name” LLMs are driving the lion’s share of enterprise AI spending, even as cheaper options proliferate.
The numbers paint a compelling picture. Snowflake’s AI-related products, including tools for data agents and machine learning workflows, helped push fiscal 2025 revenue to $3.5 billion, a 30% year-over-year increase, as highlighted in reports from WebProNews. Yet it is the undisclosed breakdown that steals the show: revenue from partnerships enabling access to proprietary LLMs far outpaces that from open-source counterparts. Insiders note that enterprises are willing to pay a premium for the perceived reliability and performance of models such as OpenAI’s GPT series or Google’s Gemini, often citing concerns over data security and model consistency in mission-critical applications.
The Dominance of Proprietary Models in Enterprise AI
This preference isn’t merely anecdotal. Posts on X, formerly Twitter, from industry analysts and investors echo the sentiment, with one prominent voice calling Snowflake the “top AI play of 2025” because of its enterprise data dominance and integrations with leading LLMs. Such integrations have propelled Snowflake’s net revenue retention rate to 124% in the first quarter of fiscal 2026, as reported by The Motley Fool, indicating that existing customers are not only sticking around but also expanding their spend on AI features. The company’s expanded partnership with Microsoft Azure, announced in a CNBC article, allows seamless access to OpenAI models directly within Snowflake’s ecosystem, further cementing this trend.
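To make that integration concrete, here is a minimal sketch in Python of how a customer might invoke a hosted brand-name model from inside Snowflake using the Cortex COMPLETE SQL function through the snowflake-connector-python package. The connection parameters and the model identifier are illustrative placeholders, and which hosted models an account can reach depends on its cloud and region, so treat this as an assumption-laden sketch rather than a documented quickstart.

```python
# Minimal sketch: call a hosted LLM from inside Snowflake via Cortex COMPLETE.
# All connection values and the model name are placeholders, not real settings.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="my_password",    # placeholder
    warehouse="ANALYTICS_WH",  # placeholder
)
try:
    cur = conn.cursor()
    # SNOWFLAKE.CORTEX.COMPLETE(model, prompt) returns the model's completion
    # as text; the call runs, and is metered, inside the customer's account.
    cur.execute(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE(%s, %s)",
        ("claude-3-5-sonnet",  # illustrative model name; availability varies
         "Summarize last quarter's churn drivers in two sentences."),
    )
    print(cur.fetchone()[0])
finally:
    conn.close()
```

The billing mechanics matter as much as the call itself: because the invocation is metered as Snowflake consumption, enterprise spend on premium third-party models can surface in Snowflake’s own revenue, which is the dynamic The Information’s figures describe.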
At the heart of Snowflake’s strategy is its AI Data Cloud, which facilitates the analysis of unstructured data and the creation of custom AI agents. A TechCrunch piece from 2024 detailed the launch of Snowflake’s own generative AI model, Arctic LLM, but recent developments show it’s the third-party brand-name models that are the real revenue engines. This aligns with findings from Snowflake’s presentations at the ACL 2025 conference, where research papers emphasized bridging gaps between LLMs and real-world enterprise challenges, such as text-to-SQL reliability and model evaluation—areas where proprietary models excel due to extensive fine-tuning and support ecosystems.
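The text-to-SQL reliability theme lends itself to a small worked pattern: have the hosted model draft a query from a natural-language question, then compile the draft with EXPLAIN before executing it so that hallucinated tables or syntax errors fail cheaply. The table, columns, prompt, and model identifier below are hypothetical, and the validation step is a common practitioner pattern rather than a method Snowflake has published.

```python
# Sketch of a guarded text-to-SQL loop: draft SQL with an LLM, compile it with
# EXPLAIN to catch bad references or syntax, then execute. Names are hypothetical.
import snowflake.connector

QUESTION = "Which ten customers spent the most last quarter?"
PROMPT = (
    "You write Snowflake SQL. Table SALES.ORDERS has columns CUSTOMER_ID, "
    "ORDER_TOTAL, and ORDER_DATE. Reply with a single SQL statement, no prose, "
    f"answering: {QUESTION}"
)

conn = snowflake.connector.connect(account="my_account", user="my_user",
                                    password="my_password", warehouse="ANALYTICS_WH")
try:
    cur = conn.cursor()
    cur.execute("SELECT SNOWFLAKE.CORTEX.COMPLETE(%s, %s)",
                ("claude-3-5-sonnet", PROMPT))  # illustrative model name
    draft_sql = cur.fetchone()[0].strip().rstrip(";")

    # EXPLAIN compiles the statement without running it, so a hallucinated
    # table or malformed clause is caught before any compute is spent.
    cur.execute(f"EXPLAIN USING TEXT {draft_sql}")

    cur.execute(draft_sql)
    for row in cur.fetchmany(10):
        print(row)
finally:
    conn.close()
```

The guard is cheap insurance rather than a guarantee: a query can compile and still answer the wrong question, which is why evaluating generated SQL remains an open problem of the kind Snowflake’s ACL 2025 papers address.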
Revenue Growth Amid AI Investments
Snowflake’s fiscal performance underscores this positioning. In its Q4 2025 earnings, product revenue hit $943 million, up 28% year-over-year, per a post by Snowflake’s CEO Sridhar Ramaswamy on X. This growth is fueled by AI-driven demand, with the company boasting 542 customers generating over $1 million in trailing 12-month revenue, as noted in investor analyses on Medium. However, challenges loom: a rating downgrade covered by AInvest pointed to a company at a maturity crossroads, with revenue reaching $1.04 billion in Q4 but margins squeezed by heavy investments in AI infrastructure.
Competitors like Databricks and Oracle are mirroring this approach, striking deals to embed brand-name LLMs, but Snowflake’s multi-cloud governance gives it an edge, as discussed in the company’s own engineering blogs. Reuters reported back in 2023 that AI adoption was already boosting Snowflake’s offerings, a prediction borne out in 2025’s figures. Yet benchmarks circulated by sources like WhisperTick on X point to emerging threats from cost-efficient alternatives such as ClickHouse, which claims faster queries at lower cost; the same cost pressure could erode Snowflake’s position if open-source models gain comparable traction among enterprises.
Implications for the Broader AI Ecosystem
Looking ahead, this revenue revelation from The Information signals a bifurcation in AI adoption. While startups and smaller firms experiment with open-source LLMs to cut costs, enterprises, including Snowflake’s core roster of 754 Forbes Global 2000 clients, prioritize scalability and compliance. Snowflake Intelligence, unveiled at the 2025 Summit per CRN coverage and examined in an executive briefing on Medium, positions the company for conversational AI in the enterprise, leveraging brand-name models for natural-language interaction with corporate data.
The stock market has responded enthusiastically, with shares jumping 47% in three months on AI optimism, according to The Globe and Mail. Analysts project continued growth, with AInvest noting an enterprise value-to-revenue ratio of 18.97x as of June 2025, a multiple it views as justified by the company’s bold AI bets. However, valuation concerns persist; addressing them will require sustained innovation, such as the AI agents and warehouse advancements revealed at Snowflake Summit 2025.
Navigating Challenges and Future Prospects
Critics argue that over-reliance on third-party LLMs could expose Snowflake to risks like model pricing changes or competitive shifts. Yet the company’s adaptive compute and AI governance features, touted on its own product pages, mitigate those risks by offering flexibility. Posts on X from data experts highlight Snowflake’s 35% share of the cloud data warehousing market, positioning it to capitalize on an AI market projected to grow from $15 billion to $76 billion by 2030.
Ultimately, Snowflake’s undisclosed AI revenue breakdown illuminates a simple truth: in the high-stakes arena of enterprise AI, brand-name LLMs reign supreme, driving real dollars and shaping strategic alliances. As the company innovates under CEO Ramaswamy’s leadership, it stands poised to redefine how businesses harness AI, blending data prowess with cutting-edge models for enduring competitive advantage.