Saurabh Rai

Search will be the future of LLM and AI Applications.

While scrolling LinkedIn, I came across an interesting article titled "Every LLM Company is a Search Company, and Search is Hard: The Future of LLM Retrieval Systems."

It caught my attention because we've spent the past few years building SWIRL to help companies scale their search and AI infrastructure. Search is everywhere, as we've mentioned before. DEV itself now uses Algolia for intelligent search (earlier, its own search code handled it). And as anyone who's tried to build a search-enabled application knows, search is complex, involving many factors like relevance ranking and personalization.

That's what makes building search hard. Anyone who has tried to build a search-enabled application knows it's difficult to manage, with a lot of factors to consider. Even Google, the leader in search and search AI, faces criticism over its results, and people have started to complain about their quality. But this isn't only about Google; there's a lot more to understand here.

What about the data inside your organization? The place where you work?

Businesses are sitting on a lot of data scattered across various departments and channels – from customer interactions and sales records to operational logs and employee feedback.
They recognize the immense potential of leveraging this data to build AI-powered applications that streamline operations, enhance customer experiences, and drive innovation. However, the challenge lies in unifying this fragmented data landscape.


That data is spread across many places:

  • Meeting notes
  • Siloed applications
  • Email archives
  • Internal documentation
  • Customer support tickets
  • Project management tools

The challenge is retrieving that information. The way we currently do it is to stream all of that data into a vector database and then use it for retrieval. This is simple and hard at the same time. Simple, because you know where to search. Hard, because managing a vector database (or any search index) takes real work. You have to:

  • Constantly pull in new data.
  • Compare the indexed data against what has changed at the source.
  • Re-index when necessary.

On top of that, if you’re using a vector database, you’ll be dealing with embeddings. But that’s a story for another blog.
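To make that concrete, here's a minimal sketch of such a sync loop. It assumes a hypothetical `vector_index` client with `upsert` and `delete` methods and tracks content hashes in a plain dict; the real bookkeeping depends on your vector database:

```python
import hashlib


def content_hash(text: str) -> str:
    """Fingerprint a document so we can tell whether it changed since the last sync."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()


def sync_index(source_docs: dict[str, str], indexed_hashes: dict[str, str], vector_index) -> dict[str, str]:
    """Compare source documents against what is already indexed and update only what changed.

    source_docs:    {doc_id: current text pulled from the source system}
    indexed_hashes: {doc_id: hash recorded at the last sync}
    vector_index:   hypothetical client exposing upsert(doc_id, text) and delete(doc_id)
    """
    new_hashes = {}
    for doc_id, text in source_docs.items():
        h = content_hash(text)
        new_hashes[doc_id] = h
        if indexed_hashes.get(doc_id) != h:
            vector_index.upsert(doc_id, text)   # new or updated document: re-embed and re-index
    for doc_id in indexed_hashes.keys() - source_docs.keys():
        vector_index.delete(doc_id)             # document removed at the source: drop it from the index
    return new_hashes
```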

So why is search the future of LLM applications?

The vast amount of data we generate is becoming increasingly unmanageable. Traditional ways of handling it are expensive and time-consuming, and they carry security risks.

When your LLM can search the information inside your own data repositories, it can give you and your team the answers you need without digging through its training memory, or hallucinating when the answer doesn't exist there. By providing the context of what you're asking about, it works more like looking something up in a book and then answering from it.
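In code, "providing the context" mostly means placing the retrieved passages in the prompt ahead of the question. A minimal sketch, assuming the retrieved results are already plain-text snippets and leaving the actual LLM call out:

```python
def build_grounded_prompt(question: str, passages: list[str]) -> str:
    """Assemble a prompt that asks the model to answer only from retrieved context,
    and to admit it when the context doesn't contain the answer."""
    context = "\n\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer the question using only the context below. "
        "If the context does not contain the answer, say you don't know.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
```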

Where should our focus be?

To develop AI applications that can perform well on internal data, we need an efficient retrieval-augmented generation (RAG) method. In this case, the retriever is a search platform.

The retriever should:

  • Connect to internal data sources.
  • Be secure, respecting and working with existing security solutions.
  • Understand context in human language.
  • Provide excellent, relevant results.
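One way to read that list is as an interface. The sketch below is generic and not SWIRL's actual API; `SourceConnector`, `Retriever`, and `Result` are hypothetical names, and the per-user check inside each connector is what "respects existing security solutions":

```python
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Result:
    source: str      # which internal system the hit came from
    title: str
    snippet: str
    score: float     # relevance score used for ranking


class SourceConnector(Protocol):
    """Adapter for one internal data source (email archive, ticket system, wiki, ...)."""
    def search(self, query: str, user: str) -> list[Result]: ...


class Retriever:
    """Fans a natural-language query out to every connector, enforcing per-user access."""

    def __init__(self, connectors: list[SourceConnector]):
        self.connectors = connectors

    def retrieve(self, query: str, user: str, top_k: int = 10) -> list[Result]:
        hits: list[Result] = []
        for connector in self.connectors:
            # Each connector applies its source system's own permissions for this user,
            # so the retriever never returns documents the user couldn't open directly.
            hits.extend(connector.search(query, user=user))
        return sorted(hits, key=lambda r: r.score, reverse=True)[:top_k]
```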

That's how we're building SWIRL: an efficient retriever that provides good answers, connects with multiple data providers, and enables AI in the enterprise.

So, that's one job done. You have the data; all you have to do is configure a data provider, create a query, and get the answers. Once you have the data you want, in real time, the opportunities are endless:

  • Get AI summaries.
  • Use that data to build reports with the help of AI.
  • Build a co-pilot that assists you with your tasks.

Once you've solved the retrieval part, you can do many things. And SWIRL runs on self-hosted platforms, so your data stays secure.
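Once retrieval is solved, the items above are mostly prompt assembly around it. Here's a rough sketch of the summary case, reusing the hypothetical `Retriever` from the earlier sketch and treating the LLM as any callable that maps a prompt to text:

```python
from typing import Callable


def summarize_from_sources(
    question: str,
    retriever: "Retriever",        # hypothetical retriever from the earlier sketch
    llm: Callable[[str], str],     # any function that takes a prompt and returns text
    user: str,
) -> str:
    """Fetch fresh results for the question and ask the LLM to summarize only those results."""
    results = retriever.retrieve(question, user=user, top_k=5)
    context = "\n".join(f"- {r.title} ({r.source}): {r.snippet}" for r in results)
    prompt = (
        "Summarize what the following search results say about the question. "
        "Cite the source name for each point.\n\n"
        f"Question: {question}\n\nResults:\n{context}\n\nSummary:"
    )
    return llm(prompt)
```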

So, what can we do with SWIRL?

SWIRL can search your sources and get you the data; the rest is just orchestration. Broadly speaking, it's simple and secure.
To give you an idea of the architecture, check the diagram below. It shows how we're doing this with the AI Infrastructure Platform.

SWIRL AI Connect Architecture

And that's the best part: you can search your data in real time.
The game changer is choosing a framework that lets you pull data from multiple applications without restricting the end user, and then letting LLMs assist you with your tasks.

One more point: the best data isn't always in a SQL database. It can be in your team's chat rooms, meeting notes, or other documents you've saved. The architecture above lets you find it and produce results that feel like you wrote them yourself.
A general search architecture, not just semantic search, is what's needed to close the information gap between your data and what you want from AI.
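"Not just semantic search" can be as simple as blending a keyword score with a vector-similarity score. A toy sketch of that blending (real systems use stronger keyword scoring such as BM25 and proper score normalization); the weight `alpha` here is hand-tuned:

```python
import math


def keyword_score(query: str, doc: str) -> float:
    """Fraction of query terms that literally appear in the document."""
    terms = query.lower().split()
    return sum(t in doc.lower() for t in terms) / max(len(terms), 1)


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def hybrid_score(query: str, doc: str, q_vec: list[float], d_vec: list[float], alpha: float = 0.5) -> float:
    """Blend lexical and semantic relevance; alpha balances the two signals."""
    return alpha * keyword_score(query, doc) + (1 - alpha) * cosine(q_vec, d_vec)
```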

Join us in building SWIRL, an open-source AI infrastructure platform

Join SWIRL Community

SWIRL is open source and built in Python. We're building a lot of good stuff, from search to chat interfaces. If you're a UI pro, help us out!

Join our Slack.

Watch this video on how to set up SWIRL. Then, join our Slack community and ping me. Let’s build an amazing open-source AI platform together. There might be some amazing rewards for you. 💐🎁

Give us a 🌟 on GitHub.

Top comments (9)

Saurabh Rai

Oh, and if someone's wondering, here's how we're beating Google's search ranking with our Reader LLM: 👇

Jake Page

Super insightful, and yeah, enabling users to effectively search their own data is so key. Nice article!

Saurabh Rai

Thanks a lot, Jake. 💖
We all need to search. : )

ChrisX001001

Can I join the team, please? I am unemployed and this would be my dream job..

Saurabh Rai

Hi @chrisx001001, please join our Slack and send a text there. We aren't hiring right now, but we will keep you updated.

Marcelo Mazza

Congrats! It looks like a great initiative, especially since it's open source!

Saurabh Rai

Thanks, @marcelomazza; a lot of great AI and AI Infrastructure Software is open source.

Bap

Very interesting read! Thanks for the write-up 🙂

Saurabh Rai

I'm glad that you liked it! 🔥