Chat Prompts Customization#
If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.
!pip install llama-index
Prompt Setup#
Below, we write custom versions of the default text QA and refine prompts, instructing the model to always answer the question, even if the retrieved context is not helpful.
from llama_index.llms import ChatMessage, MessageRole
from llama_index.prompts import ChatPromptTemplate
# Text QA Prompt
chat_text_qa_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "Always answer the question, even if the context isn't helpful."
        ),
    ),
    ChatMessage(
        role=MessageRole.USER,
        content=(
            "Context information is below.\n"
            "---------------------\n"
            "{context_str}\n"
            "---------------------\n"
            "Given the context information and not prior knowledge, "
            "answer the question: {query_str}\n"
        ),
    ),
]
text_qa_template = ChatPromptTemplate(chat_text_qa_msgs)
# Refine Prompt
chat_refine_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "Always answer the question, even if the context isn't helpful."
        ),
    ),
    ChatMessage(
        role=MessageRole.USER,
        content=(
            "We have the opportunity to refine the original answer "
            "(only if needed) with some more context below.\n"
            "------------\n"
            "{context_msg}\n"
            "------------\n"
            "Given the new context, refine the original answer to better "
            "answer the question: {query_str}. "
            "If the context isn't useful, output the original answer again.\n"
            "Original Answer: {existing_answer}"
        ),
    ),
]
refine_template = ChatPromptTemplate(chat_refine_msgs)
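Before wiring these templates into a query engine, it can help to preview exactly what the LLM will receive. The sketch below renders the text QA template with placeholder values (the context and question strings are made up for illustration), using ChatPromptTemplate's format_messages method.
# Preview the rendered chat messages (placeholder values for illustration)
preview_msgs = text_qa_template.format_messages(
    context_str="Paul Graham co-founded Y Combinator.",
    query_str="Who co-founded Y Combinator?",
)
for msg in preview_msgs:
    print(f"{msg.role.value}: {msg.content}")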
Using the Prompts#
Now, we use the prompts in an index query!
import openai
import os
os.environ["OPENAI_API_KEY"] = "sk-..."
openai.api_key = os.environ["OPENAI_API_KEY"]
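Hardcoding the key in a notebook makes it easy to leak. As an alternative, here is a small sketch using only the Python standard library to prompt for the key at runtime:
from getpass import getpass

# Prompt for the key at runtime instead of committing it to the notebook
os.environ["OPENAI_API_KEY"] = getpass("OpenAI API key: ")
openai.api_key = os.environ["OPENAI_API_KEY"]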
Download Data#
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
from llama_index import VectorStoreIndex, SimpleDirectoryReader, ServiceContext
from llama_index.llms import OpenAI
documents = SimpleDirectoryReader("./data/paul_graham/").load_data()
# Create an index using a chat model, so that we can use the chat prompts!
service_context = ServiceContext.from_defaults(
    llm=OpenAI(model="gpt-3.5-turbo", temperature=0.1)
)
index = VectorStoreIndex.from_documents(
    documents, service_context=service_context
)
Before Adding Templates#
print(index.as_query_engine().query("Who is Joe Biden?"))
I'm sorry, but the given context does not provide any information about Joe Biden.
After Adding Templates#
print(
    index.as_query_engine(
        text_qa_template=text_qa_template, refine_template=refine_template
    ).query("Who is Joe Biden?")
)
Joe Biden is the 46th President of the United States.
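As an alternative to passing the templates through as_query_engine, recent versions of LlamaIndex also let you swap prompts on an already-built query engine via get_prompts and update_prompts. The prompt keys used below are the response synthesizer's usual ones, but they vary by module and version, so inspect the output of get_prompts() first rather than taking these keys as given.
# Alternative (version-dependent): update prompts on an existing query engine
query_engine = index.as_query_engine()
print(query_engine.get_prompts().keys())  # inspect the available prompt keys
query_engine.update_prompts(
    {
        "response_synthesizer:text_qa_template": text_qa_template,
        "response_synthesizer:refine_template": refine_template,
    }
)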