Anthropic
- pydantic model llama_index.llms.anthropic.Anthropic
- JSON schema
    {
      "title": "Anthropic",
      "description": "LLM interface.",
      "type": "object",
      "properties": {
        "callback_manager": { "title": "Callback Manager" },
        "model": {
          "title": "Model",
          "description": "The anthropic model to use.",
          "default": "claude-2",
          "type": "string"
        },
        "temperature": {
          "title": "Temperature",
          "description": "The temperature to use for sampling.",
          "default": 0.1,
          "gte": 0.0,
          "lte": 1.0,
          "type": "number"
        },
        "max_tokens": {
          "title": "Max Tokens",
          "description": "The maximum number of tokens to generate.",
          "default": 512,
          "exclusiveMinimum": 0,
          "type": "integer"
        },
        "base_url": {
          "title": "Base Url",
          "description": "The base URL to use.",
          "type": "string"
        },
        "timeout": {
          "title": "Timeout",
          "description": "The timeout to use in seconds.",
          "gte": 0,
          "type": "number"
        },
        "max_retries": {
          "title": "Max Retries",
          "description": "The maximum number of API retries.",
          "default": 10,
          "gte": 0,
          "type": "integer"
        },
        "additional_kwargs": {
          "title": "Additional Kwargs",
          "description": "Additional kwargs for the anthropic API.",
          "type": "object"
        },
        "class_name": {
          "title": "Class Name",
          "type": "string",
          "default": "Anthropic_LLM"
        }
      }
    }
- Config
  - arbitrary_types_allowed: bool = True
- Fields
  - additional_kwargs (Dict[str, Any])
  - base_url (Optional[str])
  - max_retries (int)
  - max_tokens (int)
  - model (str)
  - temperature (float)
  - timeout (Optional[float])
- Validators
  - _validate_callback_manager » callback_manager
- field additional_kwargs: Dict[str, Any] [Optional]
Additional kwargs for the Anthropic API.
- field base_url: Optional[str] = None
The base URL to use.
- field max_retries: int = 10
The maximum number of API retries.
- field max_tokens: int = 512
The maximum number of tokens to generate.
- Constraints
exclusiveMinimum = 0
- field model: str = 'claude-2'
The Anthropic model to use.
- field temperature: float = 0.1
The temperature to use for sampling.
- field timeout: Optional[float] = None
The timeout to use in seconds.
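
The fields above map directly onto the constructor. A minimal construction sketch, assuming the `anthropic` SDK is installed and `ANTHROPIC_API_KEY` is set in the environment (import paths can vary between llama_index releases):

```python
from llama_index.llms.anthropic import Anthropic

# Values mirror the defaults and constraints documented above.
llm = Anthropic(
    model="claude-2",    # default model
    temperature=0.1,     # sampling temperature, constrained to 0.0-1.0
    max_tokens=512,      # must be > 0 (exclusiveMinimum)
    max_retries=10,      # API retry budget
    timeout=60.0,        # request timeout in seconds (optional)
)
```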
- async achat(messages: Sequence[ChatMessage], **kwargs: Any) → Any
Async chat endpoint for LLM.
- async acomplete(*args: Any, **kwargs: Any) → Any
Async completion endpoint for LLM.
- async astream_chat(messages: Sequence[ChatMessage], **kwargs: Any) → Any
Async streaming chat endpoint for LLM.
- async astream_complete(*args: Any, **kwargs: Any) → Any
Async streaming completion endpoint for LLM.
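
A hedged sketch of the async endpoints above; it assumes `ChatMessage` is importable from `llama_index.llms` and that `astream_chat` returns an async generator whose chunks carry new text in a `.delta` string (check the return types in your installed version):

```python
import asyncio

from llama_index.llms import ChatMessage
from llama_index.llms.anthropic import Anthropic


async def main() -> None:
    llm = Anthropic(model="claude-2")

    # Non-streaming async chat and completion.
    chat_response = await llm.achat([ChatMessage(role="user", content="Say hello.")])
    print(chat_response)

    completion = await llm.acomplete("Write one sentence about retries.")
    print(completion)

    # Streaming async chat: await the call, then iterate the async generator.
    stream = await llm.astream_chat([ChatMessage(role="user", content="Count to three.")])
    async for chunk in stream:
        print(chunk.delta, end="")
    print()


asyncio.run(main())
```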
- chat(messages: Sequence[ChatMessage], **kwargs: Any) → Any
Chat endpoint for LLM.
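
For the synchronous chat endpoint, a short sketch under the same import assumptions; printing the response object is expected to show the assistant message:

```python
from llama_index.llms import ChatMessage
from llama_index.llms.anthropic import Anthropic

llm = Anthropic(model="claude-2")
messages = [
    ChatMessage(role="system", content="You are a terse assistant."),
    ChatMessage(role="user", content="Name one use for a vector index."),
]
response = llm.chat(messages)
print(response)
```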
- classmethod class_name() → str
Get the class name, used as a unique ID in serialization.
This provides a key that makes serialization robust against actual class name changes.
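
Because the serialization key is decoupled from the Python class name, it can be read without instantiating the model; the expected value matches the `class_name` default in the schema above:

```python
from llama_index.llms.anthropic import Anthropic

# Stable key used when (de)serializing this LLM configuration.
print(Anthropic.class_name())  # "Anthropic_LLM" per the schema default
```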
- complete(*args: Any, **kwargs: Any) → Any
Completion endpoint for LLM.
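
complete() takes a plain prompt string; this sketch assumes the returned object exposes a `.text` attribute (printing the object directly also works in recent releases):

```python
from llama_index.llms.anthropic import Anthropic

llm = Anthropic(model="claude-2", max_tokens=256)
result = llm.complete("Summarize what a completion endpoint does.")
print(result.text)  # assumed attribute on the completion response
```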
- stream_chat(messages: Sequence[ChatMessage], **kwargs: Any) → Any
Streaming chat endpoint for LLM.
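
stream_chat yields incremental responses; the sketch assumes each yielded chunk carries the newly generated text in `.delta`:

```python
from llama_index.llms import ChatMessage
from llama_index.llms.anthropic import Anthropic

llm = Anthropic(model="claude-2")
messages = [ChatMessage(role="user", content="Stream a two-line rhyme.")]
for chunk in llm.stream_chat(messages):
    print(chunk.delta, end="", flush=True)
print()
```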
- stream_complete(*args: Any, **kwargs: Any) → Any
Streaming completion endpoint for LLM.
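
stream_complete is the prompt-string counterpart, with the same `.delta` assumption:

```python
from llama_index.llms.anthropic import Anthropic

llm = Anthropic(model="claude-2")
for chunk in llm.stream_complete("Explain retries in one sentence."):
    print(chunk.delta, end="", flush=True)
print()
```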
- property metadata: LLMMetadata
LLM metadata.
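
metadata reports the model limits that other llama_index components (for example, prompt helpers) consult; the attribute names below are the usual LLMMetadata fields and are an assumption worth verifying against your installed version:

```python
from llama_index.llms.anthropic import Anthropic

llm = Anthropic(model="claude-2")
meta = llm.metadata
print(meta.context_window)  # maximum input size the model accepts
print(meta.num_output)      # tokens reserved for output (tracks max_tokens)
print(meta.is_chat_model)   # whether the messages-style API is used
```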