Replicate
- pydantic model llama_index.llms.replicate.Replicate
JSON schema:
{
  "title": "Replicate",
  "description": "Simple abstract base class for custom LLMs.\n\nSubclasses must implement the `__init__`, `complete`,\n `stream_complete`, and `metadata` methods.",
  "type": "object",
  "properties": {
    "callback_manager": {
      "title": "Callback Manager"
    },
    "model": {
      "title": "Model",
      "description": "The Replicate model to use.",
      "type": "string"
    },
    "temperature": {
      "title": "Temperature",
      "description": "The temperature to use for sampling.",
      "default": 0.75,
      "gte": 0.01,
      "lte": 1.0,
      "type": "number"
    },
    "image": {
      "title": "Image",
      "description": "The image file for multimodal model to use. (optional)",
      "type": "string"
    },
    "context_window": {
      "title": "Context Window",
      "description": "The maximum number of context tokens for the model.",
      "default": 3900,
      "exclusiveMinimum": 0,
      "type": "integer"
    },
    "prompt_key": {
      "title": "Prompt Key",
      "description": "The key to use for the prompt in API calls.",
      "type": "string"
    },
    "additional_kwargs": {
      "title": "Additional Kwargs",
      "description": "Additional kwargs for the Replicate API.",
      "type": "object"
    },
    "is_chat_model": {
      "title": "Is Chat Model",
      "description": "Whether the model is a chat model.",
      "default": false,
      "type": "boolean"
    },
    "class_name": {
      "title": "Class Name",
      "type": "string",
      "default": "Replicate_llm"
    }
  },
  "required": ["model", "image", "prompt_key"]
}
- Config
arbitrary_types_allowed: bool = True
- Fields
- Validators
_validate_callback_manager
» callback_manager
- field additional_kwargs: Dict[str, Any] [Optional]
Additional kwargs for the Replicate API.
- field context_window: int = 3900
The maximum number of context tokens for the model.
- Constraints
exclusiveMinimum = 0
- field image: str [Required]
The image file for the multimodal model to use (optional).
- field is_chat_model: bool = False
Whether the model is a chat model.
- field model: str [Required]
The Replicate model to use.
- field prompt_key: str [Required]
The key to use for the prompt in API calls.
- field temperature: float = 0.75
The temperature to use for sampling.
- Constraints
gte = 0.01
lte = 1.0
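The field constraints above can be sketched as a standalone check; `validate_settings` is a hypothetical helper for illustration, not part of llama_index:

```python
def validate_settings(temperature: float = 0.75, context_window: int = 3900) -> dict:
    """Check the Replicate field constraints from the schema above."""
    # temperature is constrained to [0.01, 1.0] (gte / lte in the schema)
    if not 0.01 <= temperature <= 1.0:
        raise ValueError("temperature must be between 0.01 and 1.0")
    # context_window has exclusiveMinimum = 0, so it must be positive
    if context_window <= 0:
        raise ValueError("context_window must be a positive integer")
    return {"temperature": temperature, "context_window": context_window}
```

Values outside these ranges are rejected at model construction time by pydantic validation.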
- chat(messages: Sequence[ChatMessage], **kwargs: Any) → Any
Chat endpoint for LLM.
- classmethod class_name() → str
Get the class name, used as a unique ID in serialization.
This provides a key that makes serialization robust against actual class name changes.
- complete(*args: Any, **kwargs: Any) → Any
Completion endpoint for LLM.
- stream_chat(messages: Sequence[ChatMessage], **kwargs: Any) → Any
Streaming chat endpoint for LLM.
- stream_complete(*args: Any, **kwargs: Any) → Any
Streaming completion endpoint for LLM.
- property metadata: LLMMetadata
LLM metadata.