BaseMemory #
Bases: BaseComponent
Base class for all memory types.
NOTE: The interface for memory is not yet finalized and is subject to change.
Source code in llama-index-core/llama_index/core/memory/types.py
class_name classmethod #
class_name() -> str
Get class name.
Source code in llama-index-core/llama_index/core/memory/types.py
from_defaults abstractmethod classmethod #
from_defaults(**kwargs: Any) -> BaseMemory
Create a chat memory from defaults.
Source code in llama-index-core/llama_index/core/memory/types.py
get abstractmethod #
get(input: Optional[str] = None, **kwargs: Any) -> List[ChatMessage]
Get chat history.
Source code in llama-index-core/llama_index/core/memory/types.py
get_all abstractmethod #
get_all() -> List[ChatMessage]
Get all chat history.
Source code in llama-index-core/llama_index/core/memory/types.py
put abstractmethod #
put(message: ChatMessage) -> None
Add a message to the chat history.
Source code in llama-index-core/llama_index/core/memory/types.py
aput async #
aput(message: ChatMessage) -> None
Add a message to the chat history (async).
Source code in llama-index-core/llama_index/core/memory/types.py
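The async variants mirror their sync counterparts. A minimal sketch of the pattern, using hypothetical stand-ins rather than llama_index imports (a common default is for the async method to simply defer to the sync one):

```python
# Illustrative async usage: aput mirrors put. ChatMessage and AsyncListMemory
# are hypothetical stand-ins, not llama_index classes.
import asyncio
from dataclasses import dataclass
from typing import List


@dataclass
class ChatMessage:
    role: str
    content: str


class AsyncListMemory:
    def __init__(self) -> None:
        self._messages: List[ChatMessage] = []

    def put(self, message: ChatMessage) -> None:
        self._messages.append(message)

    async def aput(self, message: ChatMessage) -> None:
        # Default async implementation simply defers to the sync method.
        self.put(message)

    def get_all(self) -> List[ChatMessage]:
        return list(self._messages)


async def main() -> List[str]:
    memory = AsyncListMemory()
    await memory.aput(ChatMessage(role="user", content="ping"))
    return [m.content for m in memory.get_all()]


print(asyncio.run(main()))  # ['ping']
```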
put_messages #
put_messages(messages: List[ChatMessage]) -> None
Add multiple messages to the chat history.
Source code in llama-index-core/llama_index/core/memory/types.py
aput_messages async #
aput_messages(messages: List[ChatMessage]) -> None
Add multiple messages to the chat history (async).
Source code in llama-index-core/llama_index/core/memory/types.py
set abstractmethod #
set(messages: List[ChatMessage]) -> None
Set chat history.
Source code in llama-index-core/llama_index/core/memory/types.py
reset abstractmethod #
reset() -> None
Reset chat history.
Source code in llama-index-core/llama_index/core/memory/types.py
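Taken together, the methods above define the contract a memory implementation must satisfy. A minimal in-memory sketch of that contract (a simplified stand-in that does not import llama_index; `ChatMessage` here is a hypothetical dataclass, not the real class):

```python
# Hypothetical sketch of the BaseMemory contract, runnable without
# llama_index installed. Not the library's actual implementation.
from dataclasses import dataclass
from typing import Any, List, Optional


@dataclass
class ChatMessage:
    role: str
    content: str


class SimpleListMemory:
    """Minimal illustration of the BaseMemory interface: a list of messages."""

    def __init__(self) -> None:
        self._messages: List[ChatMessage] = []

    @classmethod
    def from_defaults(cls, **kwargs: Any) -> "SimpleListMemory":
        return cls()

    def get(self, input: Optional[str] = None, **kwargs: Any) -> List[ChatMessage]:
        # A real implementation might truncate to a token limit here.
        return list(self._messages)

    def get_all(self) -> List[ChatMessage]:
        return list(self._messages)

    def put(self, message: ChatMessage) -> None:
        self._messages.append(message)

    def put_messages(self, messages: List[ChatMessage]) -> None:
        for message in messages:
            self.put(message)

    def set(self, messages: List[ChatMessage]) -> None:
        self._messages = list(messages)

    def reset(self) -> None:
        self._messages = []


memory = SimpleListMemory.from_defaults()
memory.put(ChatMessage(role="user", content="hello"))
memory.put_messages([ChatMessage(role="assistant", content="hi there")])
print(len(memory.get_all()))  # 2
```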
BaseChatStoreMemory #
Bases: BaseMemory
Base class for any chat-store-backed memory.
NOTE: The interface for memory is not yet finalized and is subject to change.
Parameters:

Name | Type | Description | Default
---|---|---|---
`chat_store` | `BaseChatStore` | Simple chat store. Async methods provide same functionality as sync methods in this class. | `<dynamic>`
`chat_store_key` | `str` | | `'chat_history'`
Source code in llama-index-core/llama_index/core/memory/types.py
class_name classmethod #
class_name() -> str
Get class name.
Source code in llama-index-core/llama_index/core/memory/types.py
from_defaults abstractmethod classmethod #
from_defaults(chat_history: Optional[List[ChatMessage]] = None, llm: Optional[LLM] = None, **kwargs: Any) -> BaseChatStoreMemory
Create a chat memory from defaults.
Source code in llama-index-core/llama_index/core/memory/types.py
get_all #
get_all() -> List[ChatMessage]
Get all chat history.
Source code in llama-index-core/llama_index/core/memory/types.py
put #
put(message: ChatMessage) -> None
Add a message to the chat history.
Source code in llama-index-core/llama_index/core/memory/types.py
aput async #
aput(message: ChatMessage) -> None
Add a message to the chat history (async).
Source code in llama-index-core/llama_index/core/memory/types.py
set #
set(messages: List[ChatMessage]) -> None
Set chat history.
Source code in llama-index-core/llama_index/core/memory/types.py
reset #
reset() -> None
Reset chat history.
Source code in llama-index-core/llama_index/core/memory/types.py
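The defining feature of a chat-store-backed memory is that every operation delegates to the underlying store under `chat_store_key`. A hedged sketch of that delegation pattern, with `SimpleDictChatStore` as an illustrative stand-in rather than the real `BaseChatStore`:

```python
# Hypothetical sketch of how a chat-store-backed memory forwards each
# operation to its store. Stand-in classes only; not llama_index code.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ChatMessage:
    role: str
    content: str


@dataclass
class SimpleDictChatStore:
    """Stand-in chat store: maps a key to a list of messages."""
    store: Dict[str, List[ChatMessage]] = field(default_factory=dict)

    def get_messages(self, key: str) -> List[ChatMessage]:
        return self.store.get(key, [])

    def add_message(self, key: str, message: ChatMessage) -> None:
        self.store.setdefault(key, []).append(message)

    def set_messages(self, key: str, messages: List[ChatMessage]) -> None:
        self.store[key] = messages

    def delete_messages(self, key: str) -> None:
        self.store.pop(key, None)


class ChatStoreBackedMemory:
    """Every operation forwards to the store under chat_store_key."""

    def __init__(self, chat_store: SimpleDictChatStore,
                 chat_store_key: str = "chat_history") -> None:
        self.chat_store = chat_store
        self.chat_store_key = chat_store_key

    def get_all(self) -> List[ChatMessage]:
        return self.chat_store.get_messages(self.chat_store_key)

    def put(self, message: ChatMessage) -> None:
        self.chat_store.add_message(self.chat_store_key, message)

    def set(self, messages: List[ChatMessage]) -> None:
        self.chat_store.set_messages(self.chat_store_key, messages)

    def reset(self) -> None:
        self.chat_store.delete_messages(self.chat_store_key)


store = SimpleDictChatStore()
memory = ChatStoreBackedMemory(store, chat_store_key="session-42")
memory.put(ChatMessage(role="user", content="hello"))
print([m.content for m in memory.get_all()])  # ['hello']
```

Because the store is keyed, several memories can share one store with different keys (for example, one key per user session) without their histories mixing.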