Mem0¶
Mem0 (pronounced “mem-zero”) enhances AI assistants and agents with an intelligent memory layer, enabling personalized AI interactions. It remembers user preferences and traits and continuously updates them over time, making it ideal for applications like customer support chatbots and AI assistants.
Mem0 offers two powerful ways to leverage our technology: our managed platform and our open source solution.
If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.
%pip install llama-index-memory-mem0
!pip install llama-index
Setup with Mem0 Platform¶
Set your Mem0 Platform API key as an environment variable, replacing <your-mem0-api-key> with your actual API key.
Note: You can obtain your Mem0 Platform API key from the Mem0 Platform dashboard.
import os
os.environ["MEM0_API_KEY"] = "<your-mem0-api-key>"
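A missing key typically only surfaces as an authentication error later, when the first Mem0 call is made. As a minimal sketch, a hypothetical helper (not part of the Mem0 SDK) can fail fast instead:

```python
import os


# Hypothetical helper (not part of the Mem0 SDK): fail early with a clear
# message rather than hitting a confusing auth error on the first API call.
def require_env(name: str) -> str:
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Environment variable {name} is not set")
    return value


os.environ.setdefault("MEM0_API_KEY", "<your-mem0-api-key>")
api_key = require_env("MEM0_API_KEY")
```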
Using from_client (for the Mem0 Platform API):
from llama_index.memory.mem0 import Mem0Memory
context = {"user_id": "test_user_1"}
memory_from_client = Mem0Memory.from_client(
context=context,
api_key="<your-api-key>",
search_msg_limit=4, # Default is 5
)
The Mem0 context identifies the user, agent, or conversation in Mem0. At least one of its fields must be passed to the Mem0Memory constructor.
search_msg_limit is optional and defaults to 5. It sets how many messages from the chat history are used for memory retrieval from Mem0. More messages provide more context for retrieval, but also increase retrieval time and may surface irrelevant results.
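Conceptually, search_msg_limit acts like a window over the most recent chat messages when building the retrieval query. A minimal sketch of the idea (illustrative only, not Mem0's actual implementation):

```python
# Sketch of the idea behind search_msg_limit: only the last N chat
# messages feed the memory-retrieval query (not Mem0's actual code).
def messages_for_retrieval(history: list[str], search_msg_limit: int = 5) -> list[str]:
    return history[-search_msg_limit:]


history = [
    "Hi, my name is Mayank.",
    "I prefer email.",
    "I'm visiting SF tomorrow.",
    "What time works for a meeting?",
    "Also, remind me about the product update.",
    "Thanks!",
]

# With search_msg_limit=4, only the last four messages shape retrieval.
recent = messages_for_retrieval(history, search_msg_limit=4)
```

A larger window widens the retrieval context at the cost of latency and noisier matches, which is the trade-off described above.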
Using from_config (for Mem0 OSS):
os.environ["OPENAI_API_KEY"] = "<your-api-key>"
config = {
"vector_store": {
"provider": "qdrant",
"config": {
"collection_name": "test_9",
"host": "localhost",
"port": 6333,
"embedding_model_dims": 1536, # Change this according to your local model's dimensions
},
},
"llm": {
"provider": "openai",
"config": {
"model": "gpt-4o",
"temperature": 0.2,
"max_tokens": 1500,
},
},
"embedder": {
"provider": "openai",
"config": {"model": "text-embedding-3-small"},
},
"version": "v1.1",
}
memory_from_config = Mem0Memory.from_config(
context=context,
config=config,
search_msg_limit=4, # Default is 5
)
Initialize LLM¶
from llama_index.llms.openai import OpenAI
llm = OpenAI(model="gpt-4o")
Mem0 for Function Calling Agents¶
Use Mem0 as memory for FunctionCallingAgent.
from llama_index.core.tools import FunctionTool
from llama_index.core.agent import FunctionCallingAgent
import nest_asyncio
nest_asyncio.apply()
Initialize Tools¶
def call_fn(name: str):
"""Call the provided name.
Args:
name: str (Name of the person)
"""
print(f"Calling... {name}")
def email_fn(name: str):
"""Email the provided name.
Args:
name: str (Name of the person)
"""
print(f"Emailing... {name}")
call_tool = FunctionTool.from_defaults(fn=call_fn)
email_tool = FunctionTool.from_defaults(fn=email_fn)
agent = FunctionCallingAgent.from_tools(
[call_tool, email_tool],
llm=llm,
memory=memory_from_client, # can be memory_from_config
verbose=True,
)
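FunctionTool.from_defaults picks up each tool's name and description from the function itself, which is why the docstrings above matter. A rough pure-Python sketch of that inference (describe_tool is a hypothetical helper, not the LlamaIndex API):

```python
# Hypothetical helper sketching how tool metadata can be derived from a
# plain function; FunctionTool.from_defaults does something similar.
def describe_tool(fn) -> dict:
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
    }


def email_fn(name: str):
    """Email the provided name."""
    print(f"Emailing... {name}")


meta = describe_tool(email_fn)
```

Because the LLM chooses tools based on this metadata, clear names and docstrings directly improve tool selection.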
response = agent.chat("Hi, My name is Mayank.")
> Running step 47f7d617-4756-4f7c-8858-baf6d8e0ce3a. Step input: Hi, My name is Mayank.
Added user message to memory: Hi, My name is Mayank.
=== LLM Response ===
Hello Mayank! How can I assist you today?
response = agent.chat("My preferred way of communication would be Email.")
> Running step f08de932-bd3d-4ded-a701-5b2c6bc62788. Step input: My preferred way of communication would be Email.
Added user message to memory: My preferred way of communication would be Email.
=== LLM Response ===
Got it, Mayank! Your preferred way of communication is email. How can I assist you further?
response = agent.chat("Send me an update of your product.")
> Running step ccf8aae2-4e0e-459e-be6c-ead62d330c96. Step input: Send me an update of your product.
Added user message to memory: Send me an update of your product.
=== Calling Function ===
Calling function: email_fn with args: {"name": "Mayank"}
Emailing... Mayank
=== Function Output ===
None
> Running step e26553f6-6b3f-46d1-aa84-9ba34879461f. Step input: None
=== LLM Response ===
I've sent you an update of our product via email. If you have any other questions or need further assistance, feel free to let me know!
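Note the "=== Function Output === None" line: the tools defined earlier return None, so the agent gets no observation to reason over. A hypothetical variant that returns a confirmation string gives the LLM something concrete to confirm against:

```python
# Hypothetical variant of email_fn above: returning a string gives the
# agent an observation instead of None.
def email_fn_v2(name: str) -> str:
    """Email the provided name and confirm delivery."""
    print(f"Emailing... {name}")
    return f"Email sent to {name}"


result = email_fn_v2("Mayank")
```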
Mem0 for Chat Engines¶
Use Mem0 as memory for SimpleChatEngine.
# Initialize chat engine
from llama_index.core.chat_engine.simple import SimpleChatEngine
agent = SimpleChatEngine.from_defaults(
llm=llm, memory=memory_from_client # can be memory_from_config
)
response = agent.chat("Hi, My name is mayank")
print(response)
Hello Mayank! How can I assist you today?
response = agent.chat("I am planning to visit SF tomorrow.")
print(response)
That sounds exciting! San Francisco has a lot to offer. If you need any recommendations on places to visit or things to do, feel free to ask. Safe travels!
response = agent.chat(
"What would be a suitable time to schedule a meeting tomorrow?"
)
print(response)
Since you're planning to visit San Francisco tomorrow, it might be best to schedule meetings either in the morning before you start exploring or in the late afternoon or evening after you've had some time to enjoy the city. This way, you can make the most of your visit without feeling rushed. Let me know if you need help with anything else!
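Because the memories live in Mem0 rather than inside the engine object, a freshly constructed engine with the same context can recall earlier facts. A toy stand-in illustrates the pattern (ToyMemory is illustrative only, not the Mem0 API):

```python
class ToyMemory:
    """Illustrative stand-in for an external memory store (not the Mem0 API)."""

    def __init__(self):
        self.facts: list[str] = []

    def add(self, fact: str) -> None:
        self.facts.append(fact)

    def search(self, query: str) -> list[str]:
        # Naive keyword match standing in for semantic retrieval.
        words = query.lower().split()
        return [f for f in self.facts if any(w in f.lower() for w in words)]


shared = ToyMemory()
shared.add("User is visiting SF tomorrow")

# A second, independently constructed consumer of the same store still
# recalls earlier facts, which is the point of an external memory layer.
recalled = shared.search("sf meeting time")
```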
Mem0 for ReAct Agents¶
Use Mem0 as memory for ReActAgent.
from llama_index.core.agent import ReActAgent
agent = ReActAgent.from_tools(
[call_tool, email_tool],
llm=llm,
memory=memory_from_client, # can be memory_from_config
verbose=True,
)
response = agent.chat("Hi, My name is Mayank.")
> Running step fb3acef3-f806-493e-838f-eae950df54f4. Step input: Hi, My name is Mayank.
Thought: (Implicit) I can answer without any more tools!
Answer: Hello Mayank! How can I assist you today?
response = agent.chat("My preferred way of communication would be Email.")
> Running step e1dfafe0-7fb1-4f37-a3bd-a4d93ef7e19f. Step input: My preferred way of communication would be Email.
Thought: (Implicit) I can answer without any more tools!
Answer: Got it, Mayank! If you need to communicate or schedule anything, I'll make sure to use email as your preferred method. Let me know if there's anything specific you need help with!
response = agent.chat("Send me an update of your product.")
> Running step e4a6f24c-0008-41d7-8fe9-53d41f26cd87. Step input: Send me an update of your product.
Thought: The current language of the user is English. I need to use a tool to help me send an update via email.
Action: email_fn
Action Input: {'name': 'Mayank'}
Emailing... Mayank
Observation: None
> Running step aec65a36-eabf-473b-8572-0cb02b25969b. Step input: None
Thought: I have sent the email to Mayank with the product update. I can now confirm this action.
Answer: I have sent you an update of our product via email. Please check your inbox. Let me know if there's anything else you need!
response = agent.chat("First call me and then communicate me requirements.")
> Running step c3d61bd9-6415-49b3-94ba-8cb0ee717e3a. Step input: First call me and then communicate me requirements.
Thought: The current language of the user is English. I need to use a tool to help me answer the question.
Action: call_fn
Action Input: {'name': 'Mayank'}
Calling... Mayank
Observation: None
> Running step 6a594cb7-af31-48e3-b965-7fef88b03084. Step input: None
Thought: Since the call did not go through, I will proceed with the next step, which is to communicate via email as per your preference.
Action: email_fn
Action Input: {'name': 'Mayank'}
Emailing... Mayank
Observation: None
> Running step 72e96d48-afcc-48fc-83d7-5c0232dd7a92. Step input: None
Thought: I have attempted to call and email you, but there seems to be no response from the tools. I will provide the information here instead.
Answer: I attempted to call you, but it seems there was an issue. I'll proceed with providing the requirements update here. Please let me know if you have any specific requirements or updates you need, and I'll be happy to assist you!