Neeva, the company that tried to challenge Google’s search engine dominance, is exiting consumer search.
Google holds a stranglehold on the search market, one that Neeva attempted to break. Despite developing a promising product and business model, the company has thrown in the towel, acknowledging the challenges of building a search engine from scratch.
“Building search engines is hard,” write co-founders Sridhar Ramaswamy and Vivek Raghunathan. “It is even harder to do with a tiny team of 50 people who are up against entrenched organizations with endless resources. We overcame these obstacles and built a search stack from the ground up, running a crawl that fetched petabytes of information from the web and used that to power an independent search stack.”
Despite the challenges of creating a search engine, the two founders say convincing people to use it was even harder.
“But throughout this journey, we’ve discovered that it is one thing to build a search engine, and an entirely different thing to convince regular users of the need to switch to a better choice,” the two founders continue. “From the unnecessary friction required to change default search settings, to the challenges in helping people understand the difference between a search engine and a browser, acquiring users has been really hard. Contrary to popular belief, convincing users to pay for a better experience was actually a less difficult problem compared to getting them to try a new search engine in the first place.”
The founders say they will shut down neeva.com, along with their consumer search engine, in the next few weeks. Instead, the company will pivot its focus to its large language model (LLM) work, a path it began exploring in 2022. In fact, Neeva says it was “the first search engine to provide cited, real-time AI answers to a majority of queries early this year.” The company will explore ways of building on this work.
“Over the past year, we’ve seen the clear, pressing need to use LLMs effectively, inexpensively, safely, and responsibly,” the founders add. “Many of the techniques we have pioneered with small models, size reduction, latency reduction, and inexpensive deployment are the elements that enterprises really want, and need, today. We are actively exploring how we can apply our search and LLM expertise in these settings, and we will provide updates on the future of our work and our team in the next few weeks.”