Module Usage#
Currently the following LlamaIndex modules are supported within a `QueryPipeline`. Remember, you can define your own!
LLMs (both completion and chat)#
- Base class: `LLM`
- If chat model:
    - Input: `messages`. Takes in any `List[ChatMessage]` or any stringable input.
    - Output: `output`. Outputs a `ChatResponse` (stringable).
- If completion model:
    - Input: `prompt`. Takes in any stringable input.
    - Output: `output`. Outputs a `CompletionResponse` (stringable).
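For example, a chat LLM can be run as a single pipeline module, with a stringable input supplied under its `messages` key. This is a minimal sketch assuming the `llama-index-llms-openai` integration is installed and `OPENAI_API_KEY` is set; the model name is illustrative.

```python
from llama_index.core.query_pipeline import QueryPipeline
from llama_index.llms.openai import OpenAI

# Any LLM integration works the same way; the model name is just an example.
llm = OpenAI(model="gpt-4o-mini")

# The chat LLM module accepts a stringable input on its `messages` key
# and returns a ChatResponse, which can be cast to str.
p = QueryPipeline(chain=[llm], verbose=True)
output = p.run(messages="Tell me a joke about query pipelines.")
print(str(output))
```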
Prompts#
- Base class: `PromptTemplate`
- Input: prompt template variables. Each variable can be a stringable input.
- Output: `output`. Outputs the formatted prompt string (stringable).
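A common pattern is to chain a prompt template into an LLM: the template variables become the pipeline inputs, and the formatted prompt string flows into the LLM. A sketch, again assuming the OpenAI integration and an API key:

```python
from llama_index.core import PromptTemplate
from llama_index.core.query_pipeline import QueryPipeline
from llama_index.llms.openai import OpenAI

prompt_tmpl = PromptTemplate("Write a haiku about {topic}.")
llm = OpenAI(model="gpt-4o-mini")

# The formatted prompt string (stringable) is passed on as the LLM input.
p = QueryPipeline(chain=[prompt_tmpl, llm], verbose=True)
output = p.run(topic="query pipelines")  # run kwargs map to template variables
print(str(output))
```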
Query Engines#
- Base class: `BaseQueryEngine`
- Input: `input`. Takes in any stringable input.
- Output: `output`. Outputs a `Response` object (stringable).
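A query engine can be dropped into a pipeline as-is. The sketch below assumes some local documents in a hypothetical `./data` directory and a configured default LLM/embedding model (e.g. via `OPENAI_API_KEY`):

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.query_pipeline import QueryPipeline

# Build a simple index; the ./data path is a placeholder.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

# The query engine's `input` key takes any stringable input and
# produces a Response object (stringable).
p = QueryPipeline(chain=[query_engine], verbose=True)
output = p.run(input="What do the documents say about query pipelines?")
print(str(output))
```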
Query Transforms#
- Base class: `BaseQueryTransform`
- Input: `query_str`, `metadata` (optional). `query_str` is any stringable input.
- Output: `query_str`. Outputs a string.
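For instance, a HyDE query transform can rewrite `query_str` before it reaches a query engine. A sketch, assuming the same hypothetical `./data` documents and the `llama-index` 0.10+ import path for `HyDEQueryTransform`:

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.indices.query.query_transform import HyDEQueryTransform
from llama_index.core.query_pipeline import QueryPipeline

documents = SimpleDirectoryReader("./data").load_data()  # placeholder path
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

# The transform emits a rewritten `query_str`, which feeds the query engine.
hyde = HyDEQueryTransform(include_original=True)
p = QueryPipeline(chain=[hyde, query_engine], verbose=True)
output = p.run(query_str="How do query pipelines work?")
print(str(output))
```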
Retrievers#
- Base class: `BaseRetriever`
- Input: `input`. Takes in any stringable input.
- Output: `output`. Outputs a list of nodes (`List[BaseNode]`).
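A retriever module turns a stringable input into a list of nodes, which you can inspect directly or pass to downstream modules. A sketch with the same placeholder data directory:

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.query_pipeline import QueryPipeline

documents = SimpleDirectoryReader("./data").load_data()  # placeholder path
index = VectorStoreIndex.from_documents(documents)
retriever = index.as_retriever(similarity_top_k=3)

# Running the pipeline returns the retrieved nodes directly.
p = QueryPipeline(chain=[retriever], verbose=True)
nodes = p.run(input="query pipelines")
for node in nodes:
    print(node.get_content()[:100])
```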
Output Parsers#
- Base class: `BaseOutputParser`
- Input: `input`. Takes in any stringable input.
- Output: `output`. Outputs whatever type the output parser is meant to parse out.
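For example, a Pydantic output parser can sit at the end of a chain and turn the LLM's stringable output into a typed object. The prompt wording, model name, and `Song` class below are illustrative only:

```python
from pydantic import BaseModel

from llama_index.core import PromptTemplate
from llama_index.core.output_parsers import PydanticOutputParser
from llama_index.core.query_pipeline import QueryPipeline
from llama_index.llms.openai import OpenAI

class Song(BaseModel):
    title: str
    artist: str

prompt_tmpl = PromptTemplate(
    "Suggest one song about {topic}. "
    "Respond with a JSON object containing the keys 'title' and 'artist'."
)
llm = OpenAI(model="gpt-4o-mini")
parser = PydanticOutputParser(output_cls=Song)

# prompt -> LLM -> parser; the parser receives the LLM output as a string.
p = QueryPipeline(chain=[prompt_tmpl, llm, parser], verbose=True)
song = p.run(topic="the ocean")
print(song.title, "-", song.artist)
```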
Postprocessors/Rerankers#
- Base class: `BaseNodePostprocessor`
- Input: `nodes`, `query_str` (optional). `nodes` is a `List[BaseNode]`; `query_str` is any stringable input.
- Output: `nodes`. Outputs a list of nodes (`List[BaseNode]`).
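Because a postprocessor takes `nodes` plus an optional `query_str`, it is usually wired into a DAG with explicit links rather than a simple chain. A sketch using a Cohere reranker (requires the `llama-index-postprocessor-cohere-rerank` package and a `COHERE_API_KEY`); any `BaseNodePostprocessor` can be wired the same way:

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.query_pipeline import InputComponent, QueryPipeline
from llama_index.postprocessor.cohere_rerank import CohereRerank

documents = SimpleDirectoryReader("./data").load_data()  # placeholder path
index = VectorStoreIndex.from_documents(documents)
retriever = index.as_retriever(similarity_top_k=10)
reranker = CohereRerank(top_n=3)

p = QueryPipeline(verbose=True)
p.add_modules(
    {"input": InputComponent(), "retriever": retriever, "reranker": reranker}
)
# The reranker needs two inputs, so each link names its destination key.
p.add_link("input", "retriever")
p.add_link("retriever", "reranker", dest_key="nodes")
p.add_link("input", "reranker", dest_key="query_str")

reranked_nodes = p.run(input="What are query pipelines?")
```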
Response Synthesizers#
- Base class: `BaseSynthesizer`
- Input: `nodes`, `query_str`. `nodes` is a `List[BaseNode]`; `query_str` is any stringable input.
- Output: `output`. Outputs a `Response` object (stringable).
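A response synthesizer is wired the same way as a postprocessor: it needs both the retrieved `nodes` and the original `query_str`. A sketch using `TreeSummarize` with the default LLM settings:

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.query_pipeline import InputComponent, QueryPipeline
from llama_index.core.response_synthesizers import TreeSummarize

documents = SimpleDirectoryReader("./data").load_data()  # placeholder path
index = VectorStoreIndex.from_documents(documents)
retriever = index.as_retriever(similarity_top_k=5)
summarizer = TreeSummarize()

p = QueryPipeline(verbose=True)
p.add_modules(
    {"input": InputComponent(), "retriever": retriever, "summarizer": summarizer}
)
p.add_link("input", "retriever")
p.add_link("retriever", "summarizer", dest_key="nodes")
p.add_link("input", "summarizer", dest_key="query_str")

response = p.run(input="What do the documents say about query pipelines?")
print(str(response))
```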
Other QueryPipeline objects#
You can define a `QueryPipeline` as a module within another query pipeline. This makes it easy to string together complex workflows; see the sketch below.
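A minimal sketch of nesting, assuming the OpenAI integration: an inner pipeline that generates a question is chained into another pipeline that answers it.

```python
from llama_index.core import PromptTemplate
from llama_index.core.query_pipeline import QueryPipeline
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-4o-mini")

# Inner pipeline: topic -> question.
question_gen = QueryPipeline(
    chain=[PromptTemplate("Ask one concise question about {topic}."), llm]
)

# Inner pipeline: question -> answer.
answerer = QueryPipeline(
    chain=[PromptTemplate("Answer this question briefly: {question}"), llm]
)

# The outer pipeline treats each inner pipeline as a single module.
outer = QueryPipeline(chain=[question_gen, answerer], verbose=True)
output = outer.run(topic="query pipelines")
print(str(output))
```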
Custom Components#
See our custom components guide for more details.