
Session III: Creating a more robust RAQA system using LlamaIndex

I’m working on improving my RAG-building skills. As a part of that, I’m taking courses online. The course I’m currently taking is from AI Makerspace, as part of their LLMOps curriculum.

The related project notebook is here, and my own version of that notebook is here.


As before, we’ll be building an AI chat agent that answers questions about either the Barbie or Oppenheimer movie. However, instead of Langchain, we’ll be using a different framework called LlamaIndex.

Overview of LlamaIndex

LlamaIndex is, as they themselves describe it, a data framework built for LLMs. It excels at integrating multiple data sources, structuring data so it can be easily indexed and queried, and exposing helper tools for fetching that data when needed. By default, LlamaIndex uses OpenAI as its LLM provider, though it supports integrations with many other LLMs. LlamaIndex is often used alongside Langchain to build end-to-end LLM agent interfaces. I worked through a few examples in this blog post.
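To make the "ingest, index, then query" pattern concrete, here is a toy, dependency-free sketch of that flow. This is not LlamaIndex code: the real framework builds a vector index over embeddings and routes queries through an LLM, whereas this mock scores documents by simple keyword overlap so it runs with no API keys. The document texts and function names are my own illustrative inventions.

```python
import re

def build_index(docs):
    """Index each document by its set of lowercase words (a stand-in for
    the embedding-based index LlamaIndex would build)."""
    return [(set(re.findall(r"\w+", text.lower())), text) for text in docs]

def query(index, question, top_k=1):
    """Return the top_k documents sharing the most words with the question."""
    q_words = set(re.findall(r"\w+", question.lower()))
    scored = sorted(index, key=lambda item: len(item[0] & q_words), reverse=True)
    return [text for _, text in scored[:top_k]]

docs = [
    "Barbie was directed by Greta Gerwig.",
    "Oppenheimer was directed by Christopher Nolan.",
]
index = build_index(docs)
print(query(index, "Who directed Oppenheimer?"))
# In a real RAQA system, the retrieved text would then be passed to an LLM
# as context for generating the final answer.
```

The point of a framework like LlamaIndex is that it replaces each of these hand-rolled pieces (loading, indexing, retrieval, answer synthesis) with robust, swappable components.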