Writing Custom Modules
A core design principle of LlamaIndex is that almost every core module can be subclassed and customized.
This allows you to use LlamaIndex for any advanced LLM use case, beyond the capabilities offered by our prepackaged modules. You're free to write as much custom code as you like for any given module, while still taking advantage of our lower-level abstractions and plugging the module in alongside other components.
We offer convenient, guided ways to subclass our modules, letting you write your custom logic without having to define all the boilerplate yourself (for instance, callbacks).
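The subclassing pattern can be sketched as follows. This is an illustrative, self-contained example, not the actual LlamaIndex API: the class and method names (`BaseRetriever`, `retrieve`, `_retrieve`) are stand-ins showing how a base class can handle boilerplate such as callbacks while a subclass implements only the core logic.

```python
# Illustrative sketch (not the real LlamaIndex API): the base class wraps
# boilerplate (callback events) around a single private hook that
# subclasses override with their custom logic.
from abc import ABC, abstractmethod


class BaseRetriever(ABC):
    """Hypothetical base module: `retrieve` handles boilerplate, `_retrieve` is the hook."""

    def __init__(self, callbacks=None):
        self.callbacks = callbacks or []

    def retrieve(self, query: str) -> list[str]:
        # Boilerplate the framework manages for you: fire start/end callbacks.
        for cb in self.callbacks:
            cb("retrieve_start", query)
        results = self._retrieve(query)  # custom logic lives here
        for cb in self.callbacks:
            cb("retrieve_end", results)
        return results

    @abstractmethod
    def _retrieve(self, query: str) -> list[str]:
        """Subclasses implement only this method."""


class KeywordRetriever(BaseRetriever):
    """A custom module: return documents containing the query string."""

    def __init__(self, docs, callbacks=None):
        super().__init__(callbacks)
        self.docs = docs

    def _retrieve(self, query: str) -> list[str]:
        return [d for d in self.docs if query.lower() in d.lower()]


events = []
retriever = KeywordRetriever(
    ["LlamaIndex supports custom modules", "Unrelated note"],
    callbacks=[lambda name, payload: events.append(name)],
)
print(retriever.retrieve("custom"))
print(events)
```

Because the callback plumbing lives in the base class's public method, the subclass stays focused on its one custom method and still interoperates with any component that expects the base interface.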
This guide centralizes all the resources around writing custom modules in LlamaIndex. Check them out below 👇