Simple Chat Engine

class llama_index.chat_engine.simple.SimpleChatEngine(llm: LLM, memory: BaseMemory, prefix_messages: List[ChatMessage], callback_manager: Optional[CallbackManager] = None)

Simple Chat Engine.

Have a conversation directly with the LLM. This engine does not make use of a knowledge base or index; responses are generated from the conversation history alone.

async achat(*args: Any, **kwargs: Any) → Any

Async version of main chat interface.

async astream_chat(*args: Any, **kwargs: Any) → Any

Async version of main streaming chat interface.

chat(*args: Any, **kwargs: Any) → Any

Main chat interface.

property chat_history: List[ChatMessage]

Get chat history.

chat_repl() → None

Enter an interactive chat REPL (read-eval-print loop) in the terminal.

classmethod from_defaults(service_context: Optional[ServiceContext] = None, chat_history: Optional[List[ChatMessage]] = None, memory: Optional[BaseMemory] = None, memory_cls: Type[BaseMemory] = ChatMemoryBuffer, system_prompt: Optional[str] = None, prefix_messages: Optional[List[ChatMessage]] = None, **kwargs: Any) → SimpleChatEngine

Initialize a SimpleChatEngine from default parameters. At most one of system_prompt and prefix_messages may be provided.

reset() → None

Reset conversation state.

stream_chat(*args: Any, **kwargs: Any) → Any

Streaming version of main chat interface; tokens can be consumed incrementally as the LLM generates them.