Prompt Templates
This page documents the reference prompt templates. We first link to the default prompts, then document the base prompt template class and its subclasses.
Default Prompts
Prompt Classes
- pydantic model llama_index.prompts.base.BasePromptTemplate
JSON schema:
{ "title": "BasePromptTemplate", "type": "object", "properties": { "metadata": { "title": "Metadata", "type": "object" }, "template_vars": { "title": "Template Vars", "type": "array", "items": { "type": "string" } }, "kwargs": { "title": "Kwargs", "type": "object", "additionalProperties": { "type": "string" } }, "output_parser": { "title": "Output Parser" }, "template_var_mappings": { "title": "Template Var Mappings", "description": "Template variable mappings (Optional).", "type": "object" } }, "required": [ "metadata", "template_vars", "kwargs" ] }
- Config
arbitrary_types_allowed: bool = True
- Fields
- field function_mappings: Optional[Dict[str, Callable]] [Optional]
Function mappings (Optional). A mapping from template variable names to functions that take the current kwargs and return a string.
- field kwargs: Dict[str, str] [Required]
- field metadata: Dict[str, Any] [Required]
- field output_parser: Optional[BaseOutputParser] = None
- field template_var_mappings: Optional[Dict[str, Any]] [Optional]
Template variable mappings (Optional).
- field template_vars: List[str] [Required]
- abstract format_messages(llm: Optional[LLM] = None, **kwargs: Any) → List[ChatMessage]
Format the prompt into a list of chat messages.
- abstract partial_format(**kwargs: Any) → BasePromptTemplate
Partially format the prompt.
- pydantic model llama_index.prompts.base.PromptTemplate
JSON schema:
{ "title": "PromptTemplate", "type": "object", "properties": { "metadata": { "title": "Metadata", "type": "object" }, "template_vars": { "title": "Template Vars", "type": "array", "items": { "type": "string" } }, "kwargs": { "title": "Kwargs", "type": "object", "additionalProperties": { "type": "string" } }, "output_parser": { "title": "Output Parser" }, "template_var_mappings": { "title": "Template Var Mappings", "description": "Template variable mappings (Optional).", "type": "object" }, "template": { "title": "Template", "type": "string" } }, "required": [ "metadata", "template_vars", "kwargs", "template" ] }
- Config
arbitrary_types_allowed: bool = True
- Fields
- field template: str [Required]
- format_messages(llm: Optional[LLM] = None, **kwargs: Any) → List[ChatMessage]
Format the prompt into a list of chat messages.
- partial_format(**kwargs: Any) → PromptTemplate
Partially format the prompt.
- pydantic model llama_index.prompts.base.ChatPromptTemplate
JSON schema:
{ "title": "ChatPromptTemplate", "type": "object", "properties": { "metadata": { "title": "Metadata", "type": "object" }, "template_vars": { "title": "Template Vars", "type": "array", "items": { "type": "string" } }, "kwargs": { "title": "Kwargs", "type": "object", "additionalProperties": { "type": "string" } }, "output_parser": { "title": "Output Parser" }, "template_var_mappings": { "title": "Template Var Mappings", "description": "Template variable mappings (Optional).", "type": "object" }, "message_templates": { "title": "Message Templates", "type": "array", "items": { "$ref": "#/definitions/ChatMessage" } } }, "required": [ "metadata", "template_vars", "kwargs", "message_templates" ], "definitions": { "MessageRole": { "title": "MessageRole", "description": "Message role.", "enum": [ "system", "user", "assistant", "function", "tool" ], "type": "string" }, "ChatMessage": { "title": "ChatMessage", "description": "Chat message.", "type": "object", "properties": { "role": { "default": "user", "allOf": [ { "$ref": "#/definitions/MessageRole" } ] }, "content": { "title": "Content", "default": "" }, "additional_kwargs": { "title": "Additional Kwargs", "type": "object" } } } } }
- Config
arbitrary_types_allowed: bool = True
- Fields
- field message_templates: List[ChatMessage] [Required]
- format_messages(llm: Optional[LLM] = None, **kwargs: Any) → List[ChatMessage]
Format the prompt into a list of chat messages.
- partial_format(**kwargs: Any) → ChatPromptTemplate
Partially format the prompt.
- pydantic model llama_index.prompts.base.SelectorPromptTemplate
JSON schema:
{ "title": "SelectorPromptTemplate", "type": "object", "properties": { "metadata": { "title": "Metadata", "type": "object" }, "template_vars": { "title": "Template Vars", "type": "array", "items": { "type": "string" } }, "kwargs": { "title": "Kwargs", "type": "object", "additionalProperties": { "type": "string" } }, "output_parser": { "title": "Output Parser" }, "template_var_mappings": { "title": "Template Var Mappings", "description": "Template variable mappings (Optional).", "type": "object" }, "default_template": { "title": "Default Template" }, "conditionals": { "title": "Conditionals" } }, "required": [ "metadata", "template_vars", "kwargs" ] }
- Config
arbitrary_types_allowed: bool = True
- Fields
- field conditionals: Optional[List[Tuple[Callable[[LLM], bool], BasePromptTemplate]]] = None
- field default_template: BasePromptTemplate [Required]
- format_messages(llm: Optional[LLM] = None, **kwargs: Any) → List[ChatMessage]
Format the prompt into a list of chat messages.
- partial_format(**kwargs: Any) → SelectorPromptTemplate
Partially format the prompt.
- select(llm: Optional[LLM] = None) → BasePromptTemplate
Select the template to use: the first conditional whose predicate matches the given LLM wins, otherwise the default template is returned.
- pydantic model llama_index.prompts.base.LangchainPromptTemplate
JSON schema:
{ "title": "LangchainPromptTemplate", "type": "object", "properties": { "metadata": { "title": "Metadata", "type": "object" }, "template_vars": { "title": "Template Vars", "type": "array", "items": { "type": "string" } }, "kwargs": { "title": "Kwargs", "type": "object", "additionalProperties": { "type": "string" } }, "output_parser": { "title": "Output Parser" }, "template_var_mappings": { "title": "Template Var Mappings", "description": "Template variable mappings (Optional).", "type": "object" }, "selector": { "title": "Selector" }, "requires_langchain_llm": { "title": "Requires Langchain Llm", "default": false, "type": "boolean" } }, "required": [ "metadata", "template_vars", "kwargs" ] }
- Config
arbitrary_types_allowed: bool = True
- Fields
- field requires_langchain_llm: bool = False
- field selector: Any = None
- format_messages(llm: Optional[LLM] = None, **kwargs: Any) → List[ChatMessage]
Format the prompt into a list of chat messages.
- partial_format(**kwargs: Any) → BasePromptTemplate
Partially format the prompt.