New breed of search with AI assist combines vector database and keywords

The race to develop the best search engine technology heated up again this week with Tuesday's launch of Algolia NeuralSearch. Built on an AI engine, it combines vector database search with traditional keyword search.

Algolia's system previously relied largely on keyword relevance, the default approach since search technology first appeared. According to Algolia, its existing technology was already delivering 1.5 trillion searches per year, powering site queries for more than 17,000 organizations including Walgreens, Under Armour and Petsmart.

Then, in September 2022, Algolia acquired Search.io to expand its search capabilities, adding new vector search functionality.

The new NeuralSearch service combines Algolia's keyword technology with Search.io's vector technology. The melding of the two approaches delivers what the company claims is a new type of highly accurate search.

Vector databases are an increasingly important and common part of many AI deployments, providing a data repository optimized for machine learning operations and data queries.

“What we got from Search.io is vector search that has the natural language understanding, and it enables users to be able to search as we think,” Bernadette Nixon, CEO of Algolia, told VentureBeat. “So it’s inference, and it’s really making the jump from everything having to match a keyword, to really understanding the intent and the context behind what you’re looking for.”
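
To make that distinction concrete, the toy sketch below contrasts keyword matching with embedding (vector) similarity. The three-dimensional vectors and the stand-in query embedding are illustrative assumptions, not anything Algolia has published; a production system would use a trained embedding model with hundreds of dimensions.

```python
# A toy contrast between keyword matching and vector (semantic) matching.
# The embeddings are hand-made 3-dimensional stand-ins; a real system
# would use a trained embedding model.
import numpy as np

DOCS = {
    "running shoes":     np.array([0.9, 0.1, 0.0]),
    "trail sneakers":    np.array([0.8, 0.2, 0.1]),
    "cast iron skillet": np.array([0.0, 0.1, 0.9]),
}

def keyword_match(query: str, doc: str) -> bool:
    """Classic keyword relevance: do query and document share a term?"""
    return bool(set(query.lower().split()) & set(doc.lower().split()))

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "footwear for jogging" shares no terms with any title, so keyword search
# finds nothing, but its (stand-in) embedding still points the same way as
# the shoe documents, so vector similarity ranks them highest.
query_vec = np.array([0.85, 0.15, 0.05])
for title, vec in DOCS.items():
    print(f"{title:18s} keyword={keyword_match('footwear for jogging', title)} "
          f"cosine={cosine(query_vec, vec):.3f}")
```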

Why merging vector database and keyword search is more than the sum of its parts

The market for vector database technology has become increasingly frothy in recent weeks as the AI landscape widens and demand spikes. Startup Pinecone, for example, announced at the end of April that it had raised $100 million in new funding.

Nixon explained that what Algolia has isn’t just a vector database. She emphasized that the company’s “secret sauce” is how it manages the vectors, which are mathematical embeddings, in a way that can accelerate performance for queries.

“We found a way of using them [vectors] for what they’re great at: getting at the meaning and understanding behind the search term,” Nixon said. “And then we’re basically compressing them in such a way as to maintain 100% of that fidelity, but cutting about 90% of the cost out.”

At the core of the “secret sauce” is an approach that Algolia calls neural hashing. This takes the vector similarities in the database and distills them into a binary form that can be stored in the Algolia index, delivering rapid responses to queries.
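
Algolia has not detailed the algorithm behind neural hashing, but the broad pattern it describes (turning floating-point embeddings into short binary codes that can be compared with bitwise operations) can be sketched in a few lines. The sign-based hashing below is an assumption for illustration only: a 768-dimension float32 vector takes 3,072 bytes, while a one-bit-per-dimension code takes 96, which hints at how cost reductions of the order Nixon describes are possible.

```python
# A rough sketch of hashing embeddings into compact binary codes.
# Algolia has not published how neural hashing works internally; sign
# binarization is used here purely to illustrate the storage and speed
# trade-off, not as a description of Algolia's method.
import numpy as np

rng = np.random.default_rng(0)
doc_vecs = rng.normal(size=(10_000, 768)).astype(np.float32)  # 3,072 bytes per doc as float32

# One bit per dimension: 768 bits = 96 bytes per document, ~97% smaller.
doc_bits = np.packbits(doc_vecs > 0, axis=1)

query_vec = rng.normal(size=768).astype(np.float32)
query_bits = np.packbits(query_vec > 0)

# Hamming distance via cheap bitwise ops: XOR, then count differing bits.
dists = np.unpackbits(doc_bits ^ query_bits, axis=1).sum(axis=1)
print("closest doc:", int(dists.argmin()), "bits differing:", int(dists.min()))
```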

The other part of the “secret sauce” is that Algolia can apply both keyword and vector search to every query and provide users with an optimized result.
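
Algolia has not said how NeuralSearch blends the two result sets. One common, generic way to merge a keyword ranking with a vector ranking is reciprocal rank fusion; the sketch below uses that method with made-up document IDs purely to illustrate the idea of combining both signals per query, and should not be read as Algolia's implementation.

```python
# Reciprocal rank fusion (RRF): a generic way to fuse a keyword ranking
# and a vector ranking into one list. The doc ids are hypothetical.
from collections import defaultdict

def rrf(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Fuse several ranked lists of doc ids into a single ranking."""
    scores: dict[str, float] = defaultdict(float)
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] += 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

keyword_hits = ["sku-17", "sku-03", "sku-42"]   # from the inverted (keyword) index
vector_hits  = ["sku-03", "sku-88", "sku-17"]   # from nearest-neighbor vector search
print(rrf([keyword_hits, vector_hits]))         # ['sku-03', 'sku-17', ...]
```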

A common challenge with site search based only on keywords is the potential for a user to get no results. Nixon said that early users of NeuralSearch have reported 70% reductions in queries returning zero results.

Going a step further, organizations have reported up to a 20% increase in conversion to revenue.

Coming soon: Vector databases for everyone

Algolia’s aspirations for its vector database technology go beyond search.

Nixon explained that what’s happening now is that if a company wants to use its own data for AI, as opposed to just whatever is in a large language model (LLM), more often than not it has to train and store data in a vector database.

The vector database can also keep track of all the questions it has been asked, as well as the answers, so that the user can easily retrieve the results without having to go back and recreate the prompt.
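
Algolia has not described how that caching works, but the general pattern (keying cached answers on question embeddings and reusing an answer when a new question lands close enough in vector space) can be sketched as follows. The class, the cosine comparison and the similarity threshold are illustrative assumptions, not a documented API.

```python
# A minimal semantic cache: store (question embedding, answer) pairs and
# reuse an answer when a new question's embedding is similar enough.
# How embeddings are produced is left to the application.
import numpy as np

class SemanticCache:
    def __init__(self, threshold: float = 0.9):
        self.threshold = threshold
        self.entries: list[tuple[np.ndarray, str]] = []  # (question vector, answer)

    def get(self, q_vec: np.ndarray) -> str | None:
        for vec, answer in self.entries:
            sim = float(q_vec @ vec / (np.linalg.norm(q_vec) * np.linalg.norm(vec)))
            if sim >= self.threshold:
                return answer          # close enough: reuse the cached answer
        return None

    def put(self, q_vec: np.ndarray, answer: str) -> None:
        self.entries.append((q_vec, answer))
```

The threshold is the main design knob in such a cache: set it too low and unrelated questions share answers, set it too high and near-duplicate questions miss the cache and trigger a fresh prompt anyway.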

To date, organizations have mostly had to deploy vector databases alongside traditional SQL-based databases to power larger applications. Nixon hinted that Algolia is planning to change that paradigm in the near future with an upcoming release that will enable vector embeddings to work inside common SQL databases as just another data type.

“What we’re saying is, you don’t have to bother introducing another database; what we can do is vectorize that data stored in your database in just another field,” Nixon said. “You don’t have to implement a separate vector database, we can vectorize it, hash it and send it back to you.”
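
That capability has not shipped, and Algolia has not published how it will work, so the sketch below is only a generic illustration of the underlying idea: an embedding stored as one more column in an ordinary SQL table, with similarity computed in application code. Nothing here reflects Algolia's actual design.

```python
# A generic sketch of treating an embedding as "just another field" in an
# ordinary SQL table: store it as a BLOB next to the row it describes and
# scan for similarity in application code. Not Algolia's implementation.
import sqlite3
import numpy as np

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, title TEXT, embedding BLOB)")

def to_blob(vec: np.ndarray) -> bytes:
    return vec.astype(np.float32).tobytes()

def from_blob(blob: bytes) -> np.ndarray:
    return np.frombuffer(blob, dtype=np.float32)

rng = np.random.default_rng(1)
for i, title in enumerate(["running shoes", "trail sneakers", "cast iron skillet"]):
    conn.execute("INSERT INTO products VALUES (?, ?, ?)",
                 (i, title, to_blob(rng.normal(size=8))))

query = rng.normal(size=8).astype(np.float32)
rows = conn.execute("SELECT id, title, embedding FROM products").fetchall()
best = max(rows, key=lambda r: float(query @ from_blob(r[2])))  # brute-force nearest neighbor
print("best match:", best[1])
```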
