In the high-stakes world of artificial intelligence startups, Anthropic is rewriting the rules of venture capital fundraising amid unprecedented investor frenzy. The San Francisco-based company, known for its Claude AI models, is in the midst of a massive funding round that could value it at a staggering $170 billion, according to a recent report. The round comes as venture capitalists scramble for a piece of the action, and Anthropic is exercising its newfound leverage by imposing stringent conditions on participation.
Sources familiar with the matter indicate that the round, which has attracted overwhelming interest, is being led by prominent investor Menlo Ventures. The firm, an early backer of Anthropic, has a history of steering significant investments into the AI space, including a $750 million round in 2023, as reported by Reuters. This time, however, Anthropic is not just accepting capital; it is dictating terms, particularly around special purpose vehicles (SPVs), which allow investors to pool funds for specific deals.
Amid surging demand from venture firms eager to bet on AI’s future, Anthropic’s selective approach to SPVs highlights a shift in power dynamics, where hot startups can afford to cherry-pick partners who align with their long-term vision and ethical standards.
This selectivity stems from Anthropic's rising prominence in the enterprise AI market. A mid-2025 report from Menlo Ventures found that enterprise spending on large language models doubled to $8.4 billion in just six months, with Anthropic capturing a leading 32% market share by usage, ahead of OpenAI's 25%. As TechCrunch noted, this is a reversal from two years ago, when OpenAI dominated with a 50% share, and it underscores Anthropic's edge in areas like coding performance and safety features.
Investors are undeterred by the hurdles. SPVs, traditionally a flexible tool that lets VCs rally limited partners for outsized bets, are now under scrutiny. Anthropic is reportedly limiting which SPVs can join, favoring those with strategic value over sheer financial muscle. This mirrors a broader trend in venture capital toward more tactical use of SPVs, as industry expert Chris Harvey highlighted in a LinkedIn post pointing to Menlo's own $500 million SPV for Anthropic as a prime example.
By prioritizing investors who can offer more than money—such as expertise in scaling AI responsibly—Anthropic is positioning itself not just for growth, but for sustainable leadership in a sector fraught with regulatory and ethical challenges.
The company's ascent is further evidenced by its consumer-facing initiatives. Menlo Ventures' 2025 State of Consumer AI report, based on a survey of more than 5,000 U.S. adults and available on the firm's website, shows rapid adoption of AI tools, with Anthropic benefiting from its focus on trustworthy models. This consumer traction complements its enterprise dominance, as seen in partnerships such as the funding of AI nutrition app Alma, backed by Menlo and Anthropic's Anthology Fund, per Business Insider.
Critics argue that such selectivity could alienate potential allies, but insiders view it as a savvy move in an overheated market. With AI investments soaring (enterprise LLM spend alone hit $13.8 billion last year, according to Menlo's 2024 report published via GlobeNewswire), Anthropic's strategy may set a precedent for how AI unicorns navigate capital influxes.
As the funding round progresses, the interplay between Anthropic's market leadership and its fundraising tactics will likely shape how other AI firms approach investors, emphasizing quality of capital over quantity in an era of rapid technological advancement.
Ultimately, this round isn’t just about valuation; it’s a testament to Anthropic’s confidence. By being choosy with SPVs and partners, the company is betting on a future where AI development prioritizes alignment with human values, a core tenet since its founding by former OpenAI executives. As the dust settles, venture circles will watch closely to see if this bold stance pays off in an industry where today’s darlings can quickly become tomorrow’s cautionary tales.