How to read these docs
Welcome to the LlamaIndex documentation! We've tried hard to make these docs approachable regardless of your experience level with LlamaIndex, LLMs, and generative AI in general.
Before you start
LlamaIndex is a Python library, so you should have Python installed and a basic working understanding of how to write it. If you prefer JavaScript, we recommend trying out our TypeScript package.
Many of our examples are formatted as Notebooks, by which we mean Jupyter-style notebooks. You don’t have to have Jupyter installed; you can try out most of our examples on a hosted service like Google Colab.
Structure of these docs
Our docs are structured so you can progress roughly in order simply by moving down the sidebar on the left, or by hitting the “next” link at the bottom of each page.
Getting started
The section you’re in right now. We can take you from knowing nothing about LlamaIndex and LLMs: install the library, write your first demo in five lines of code, learn the high-level concepts of LLM applications, and then see how to customize the five-line example to meet your needs.
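As a preview, here is a minimal sketch of what that five-line demo typically looks like. It assumes the `llama_index.core` package layout, an OpenAI API key available in your environment, and a local `data/` folder containing the documents you want to index:

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("data").load_data()  # load files from ./data
index = VectorStoreIndex.from_documents(documents)     # build a vector index over them
query_engine = index.as_query_engine()                 # wrap the index in a query interface
print(query_engine.query("What are these documents about?"))
```

Each of these steps is unpacked in the pages that follow.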
Use cases
If you’re a dev trying to figure out whether LlamaIndex will work for your use case, we have an overview of the types of things you can build.
Understanding LlamaIndex
Once you’ve completed the Getting Started section, this is the next place to go. In a series of bite-sized tutorials, we’ll walk you through every stage of building a production LlamaIndex application and help you level up on the concepts of the library and LLMs in general as you go.
Optimizing
Already got a working LlamaIndex application and looking to refine it further? Our optimizing section walks you through the first things you should try, like adjusting your embedding model and chunk size, then progressively more complex and subtle customizations, all the way to fine-tuning your model.
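As a taste of the kind of adjustment that section covers, here is a minimal sketch of tuning global chunking settings. It assumes the `Settings` object from `llama_index.core`; the specific values are purely illustrative, not recommendations:

```python
from llama_index.core import Settings

# Illustrative values only: smaller chunks tend to sharpen retrieval,
# while larger chunks keep more surrounding context in each node.
Settings.chunk_size = 512
Settings.chunk_overlap = 50
```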
Module guides
Arranged in the same order as the Understanding section's walkthrough of building an LLM application, these are comprehensive, lower-level guides to the individual components of LlamaIndex and how to use them.