How to read these docs

Welcome to the LlamaIndex documentation! We've tried hard to make these docs approachable regardless of your experience level with LlamaIndex and with LLMs and generative AI in general.

Before you start

LlamaIndex is a Python library, so you should have Python installed and a basic working knowledge of writing it. If you prefer JavaScript, we recommend trying out our TypeScript package.

Many of our examples are formatted as notebooks, by which we mean Jupyter-style notebooks. You don't need Jupyter installed; you can try most of our examples on a hosted service such as Google Colab.

Structure of these docs

Our docs are structured so that you can progress roughly in order by moving through the links at the top of the page from left to right, or simply by hitting the "next" link at the bottom of each page.

  1. Getting started: The section you're in right now. It takes you from knowing nothing about LlamaIndex and LLMs: install the library, write your first demo in five lines of code, learn more about the high-level concepts of LLM applications, and then see how to customize the five-line example to meet your needs.

  2. Learn: Once you've completed the Getting Started section, this is the next place to go. In a series of bite-sized tutorials, we'll walk you through every stage of building a production LlamaIndex application and help you level up on the concepts of the library and LLMs in general as you go.

  3. Use cases: If you're a developer trying to figure out whether LlamaIndex will work for your use case, this section gives an overview of the kinds of things you can build.

  4. Examples: We have rich notebook examples for nearly every feature under the sun. Explore these to find and learn something new about LlamaIndex.

  5. Component guides: Arranged in the same order as the stages of building an LLM application covered in our Learn section, these are comprehensive, lower-level guides to the individual components of LlamaIndex and how to use them.

  6. Advanced Topics: Already have a working LlamaIndex application and looking to refine it further? Our advanced section walks you through the first things you should try optimizing, such as your embedding model and chunk size, then through progressively more complex and subtle customizations, all the way to fine-tuning your model.