SB-AI (Select Better AI)

Learn about LLM hallucination and a new technology called RAG

What it does

As large language models (LLMs) are integrated into more services, it is important to recognize that they can confidently provide incorrect information, a phenomenon known as "hallucination." Users should treat LLM output with caution, because it is not always accurate. Retrieval-Augmented Generation (RAG) reduces this risk by supplying the model with additional reference data so that its responses are grounded in accurate sources. To demonstrate the difference, we built an app themed around the "Anpanman" anime that compares two AIs: one using the LLM alone and one enhanced with RAG. The RAG-enhanced AI gave noticeably more accurate answers, highlighting the benefit of this technique for delivering correct information. The app aims to educate users about the limitations of LLMs and the advantages of RAG.
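
The sketch below illustrates the comparison in a minimal form, assuming a Qdrant collection named "anpanman" that already holds embedded reference passages; the `embed()` and `generate()` functions are hypothetical stand-ins for whatever embedding model and LLM the app actually uses, not the project's real implementation.

```python
from qdrant_client import QdrantClient

# Assumes a local Qdrant instance with an "anpanman" collection of reference passages.
client = QdrantClient(url="http://localhost:6333")


def embed(text: str) -> list[float]:
    """Hypothetical: convert text into a query vector with your embedding model."""
    raise NotImplementedError


def generate(prompt: str) -> str:
    """Hypothetical: call the underlying LLM with a plain text prompt."""
    raise NotImplementedError


def answer_llm_only(question: str) -> str:
    # Baseline AI: the model answers from its own parameters and may hallucinate.
    return generate(question)


def answer_with_rag(question: str, top_k: int = 3) -> str:
    # RAG-enhanced AI: retrieve relevant passages from Qdrant first,
    # then let the LLM answer grounded in that retrieved context.
    hits = client.search(
        collection_name="anpanman",
        query_vector=embed(question),
        limit=top_k,
    )
    context = "\n".join(hit.payload["text"] for hit in hits)
    prompt = (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )
    return generate(prompt)
```

Showing both answers side by side for the same question is what lets users see hallucination directly: the LLM-only path can invent details about the anime, while the RAG path is constrained to the retrieved passages.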

Built with

  • Web/Chrome
  • Django
  • Qdrant

Team

By

YusukeTomy

From

Japan