Function Calling Anthropic Agent¶
This notebook shows you how to use our Anthropic agent, powered by function calling capabilities.
NOTE: Only claude-3 models support function calling using Anthropic's API.
Initial Setup¶
Let's start by importing some simple building blocks.
The main things we need are:
- the Anthropic API (using our own llama_index LLM class)
- a place to keep conversation history
- a definition for tools that our agent can use
If you're opening this Notebook on colab, you will probably need to install LlamaIndex 🦙.
%pip install llama-index-llms-anthropic
%pip install llama-index-embeddings-openai
!pip install llama-index
from llama_index.llms.anthropic import Anthropic
from llama_index.core.tools import FunctionTool
import nest_asyncio
nest_asyncio.apply()
Let's define some very simple calculator tools for our agent.
def multiply(a: int, b: int) -> int:
"""Multiple two integers and returns the result integer"""
return a * b
multiply_tool = FunctionTool.from_defaults(fn=multiply)
def add(a: int, b: int) -> int:
"""Add two integers and returns the result integer"""
return a + b
add_tool = FunctionTool.from_defaults(fn=add)
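FunctionTool reads the tool's name from the function and its description from the docstring, which is what the LLM sees when deciding which tool to call. A quick sanity check (illustrative, not required):
# the function name becomes the tool name, the docstring its description
print(multiply_tool.metadata.name)
print(multiply_tool.metadata.description)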
Make sure your ANTHROPIC_API_KEY is set. Otherwise, explicitly specify the api_key parameter.
llm = Anthropic(model="claude-3-opus-20240229", api_key="sk-ant-...")
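If the key is already exported in your environment, the api_key argument can be omitted; a minimal sketch, assuming the underlying Anthropic client falls back to the ANTHROPIC_API_KEY environment variable (which the Anthropic SDK does by default):
import os

# assumes ANTHROPIC_API_KEY is exported; the client picks it up automatically
assert "ANTHROPIC_API_KEY" in os.environ
llm = Anthropic(model="claude-3-opus-20240229")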
Initialize Anthropic Agent¶
Here we initialize a simple Anthropic agent with calculator functions.
from llama_index.core.agent import FunctionCallingAgent
agent = FunctionCallingAgent.from_tools(
[multiply_tool, add_tool],
llm=llm,
verbose=True,
allow_parallel_tool_calls=False,
)
Chat¶
response = agent.chat("What is (121 + 2) * 5?")
print(str(response))
Added user message to memory: What is (121 + 2) * 5?
=== Calling Function ===
Calling function: add with args: {"a": 121, "b": 2}
=== Calling Function ===
Calling function: multiply with args: {"a": 123, "b": 5}
assistant: Therefore, (121 + 2) * 5 = 615
# inspect sources
print(response.sources)
[ToolOutput(content='123', tool_name='add', raw_input={'args': (), 'kwargs': {'a': 121, 'b': 2}}, raw_output=123), ToolOutput(content='615', tool_name='multiply', raw_input={'args': (), 'kwargs': {'a': 123, 'b': 5}}, raw_output=615)]
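Each entry is a ToolOutput, so the individual tool calls can also be inspected programmatically:
# each source records the tool name, the arguments it received, and its raw result
for source in response.sources:
    print(source.tool_name, source.raw_input["kwargs"], "->", source.raw_output)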
Async Chat¶
Let's also enable parallel function calling so that the agent can run two multiply operations simultaneously.
# enable parallel function calling
agent = FunctionCallingAgent.from_tools(
[multiply_tool, add_tool],
llm=llm,
verbose=True,
allow_parallel_tool_calls=True,
)
response = await agent.achat("What is (121 * 3) + (5 * 8)?")
print(str(response))
Added user message to memory: What is (121 * 3) + (5 * 8)?
=== Calling Function ===
Calling function: multiply with args: {"a": 121, "b": 3}
=== Calling Function ===
Calling function: multiply with args: {"a": 5, "b": 8}
=== Calling Function ===
Calling function: add with args: {"a": 363, "b": 40}
assistant: Therefore, the result of (121 * 3) + (5 * 8) is 403.
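achat is a regular coroutine, so outside a notebook you would drive it with asyncio yourself. A minimal sketch (in this notebook, the bare await above already works thanks to nest_asyncio):
import asyncio

async def main():
    response = await agent.achat("What is (121 * 3) + (5 * 8)?")
    print(str(response))

# for a plain Python script; inside the notebook, top-level await suffices
asyncio.run(main())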
Anthropic Agent over RAG Pipeline¶
Build an Anthropic agent over a simple 10K document. We use OpenAI embeddings and claude-3-haiku-20240307 to construct the RAG pipeline, and pass it to the Anthropic Opus agent as a tool.
!mkdir -p 'data/10k/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/10k/uber_2021.pdf' -O 'data/10k/uber_2021.pdf'
--2024-04-04 18:12:42--  https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/10k/uber_2021.pdf
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.111.133, 185.199.108.133, 185.199.109.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.111.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1880483 (1.8M) [application/octet-stream]
Saving to: ‘data/10k/uber_2021.pdf’

data/10k/uber_2021. 100%[===================>]   1.79M  6.09MB/s    in 0.3s

2024-04-04 18:12:43 (6.09 MB/s) - ‘data/10k/uber_2021.pdf’ saved [1880483/1880483]
from llama_index.core.tools import QueryEngineTool, ToolMetadata
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.anthropic import Anthropic
embed_model = OpenAIEmbedding(api_key="sk-...")
query_llm = Anthropic(model="claude-3-haiku-20240307", api_key="sk-ant-...")
# load data
uber_docs = SimpleDirectoryReader(
input_files=["./data/10k/uber_2021.pdf"]
).load_data()
# build index
uber_index = VectorStoreIndex.from_documents(
uber_docs, embed_model=embed_model
)
uber_engine = uber_index.as_query_engine(similarity_top_k=3, llm=query_llm)
query_engine_tool = QueryEngineTool(
query_engine=uber_engine,
metadata=ToolMetadata(
name="uber_10k",
description=(
"Provides information about Uber financials for year 2021. "
"Use a detailed plain text question as input to the tool."
),
),
)
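Before wiring the engine into the agent, it can help to sanity-check it directly; a quick sketch (the question is illustrative):
# query the RAG pipeline directly, bypassing the agent
check_response = uber_engine.query("What was Uber's revenue in 2021?")
print(str(check_response))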
from llama_index.core.agent import FunctionCallingAgent
agent = FunctionCallingAgent.from_tools(
[query_engine_tool], llm=llm, verbose=True
)
response = agent.chat("Tell me both the risk factors and tailwinds for Uber?")
print(str(response))
Added user message to memory: Tell me both the risk factors and tailwinds for Uber?
=== Calling Function ===
Calling function: uber_10k with args: {"input": "What were some of the key risk factors and tailwinds mentioned for Uber's business in 2021?"}
assistant: In summary, some of the key risk factors Uber faced in 2021 included regulatory challenges, IP protection, staying competitive with new technologies, seasonality and forecasting challenges due to COVID-19, and risks of international expansion. However, Uber also benefited from tailwinds like accelerated growth in food delivery due to the pandemic and adapting well to new remote work arrangements.
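Because the agent keeps conversation history, follow-up questions can refer back to the previous turn; an illustrative follow-up:
# the agent's memory retains the prior answer, so "those" resolves in context
response = agent.chat("Condense those risk factors into one sentence.")
print(str(response))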