Gradient Base Model

pydantic model llama_index.llms.gradient.GradientBaseModelLLM

JSON schema:
{
   "title": "GradientBaseModelLLM",
   "description": "Simple abstract base class for custom LLMs.\n\nSubclasses must implement the `__init__`, `complete`,\n    `stream_complete`, and `metadata` methods.",
   "type": "object",
   "properties": {
      "callback_manager": {
         "title": "Callback Manager"
      },
      "max_tokens": {
         "title": "Max Tokens",
         "description": "The number of tokens to generate.",
         "exclusiveMinimum": 0,
         "exclusiveMaximum": 512,
         "type": "integer"
      },
      "access_token": {
         "title": "Access Token",
         "description": "The Gradient access token to use.",
         "type": "string"
      },
      "host": {
         "title": "Host",
         "description": "The url of the Gradient service to access.",
         "type": "string"
      },
      "workspace_id": {
         "title": "Workspace Id",
         "description": "The Gradient workspace id to use.",
         "type": "string"
      },
      "base_model_slug": {
         "title": "Base Model Slug",
         "description": "The slug of the base model to use.",
         "type": "string"
      }
   },
   "required": [
      "base_model_slug"
   ]
}

Config
  • arbitrary_types_allowed: bool = True

Fields
  • access_token (Optional[str])
  • base_model_slug (str)
  • host (Optional[str])
  • max_tokens (Optional[int])
  • workspace_id (Optional[str])

Validators
  • _validate_callback_manager » callback_manager

field access_token: Optional[str] = None

The Gradient access token to use.

field base_model_slug: str [Required]

The slug of the base model to use.

field host: Optional[str] = None

The URL of the Gradient service to access.

field max_tokens: Optional[int] = None

The maximum number of tokens to generate.

Constraints
  • exclusiveMinimum = 0

  • exclusiveMaximum = 512

field workspace_id: Optional[str] = None

The Gradient workspace id to use.
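
For orientation, a minimal construction sketch. The slug, token, and workspace id below are placeholders; when the optional fields are omitted, the underlying Gradient client resolves them on its own (e.g. from environment variables).

from llama_index.llms.gradient import GradientBaseModelLLM

# Placeholder values: substitute your own Gradient credentials and a
# base model slug that exists in your workspace.
llm = GradientBaseModelLLM(
    base_model_slug="llama2-7b-chat",  # required
    access_token="my-gradient-token",  # optional
    workspace_id="my-workspace-id",    # optional
    max_tokens=400,                    # optional; must satisfy 0 < n < 512
)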

close() → None

Close the underlying Gradient client.

complete(*args: Any, **kwargs: Any) → Any

Completion endpoint for LLM.
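
Assuming an llm constructed as in the sketch above, a call looks like the following; the generated text is available on the returned CompletionResponse:

# Synchronous completion; returns a CompletionResponse.
response = llm.complete("The first president of the United States was ")
print(response.text)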

stream_complete(prompt: str, **kwargs: Any) → Generator[CompletionResponse, None, None]

Streaming completion endpoint for LLM.
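
A sketch of consuming the streaming endpoint with the same hypothetical llm; each yielded CompletionResponse carries the incremental text on its delta attribute:

# Iterate the generator; each item is a CompletionResponse
# whose `delta` holds the newly generated chunk.
for chunk in llm.stream_complete("Write a haiku about gradient descent:"):
    print(chunk.delta, end="", flush=True)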

property metadata: LLMMetadata

LLM metadata.
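
A brief sketch of inspecting the metadata; the attributes shown are standard LLMMetadata fields, not Gradient-specific:

meta = llm.metadata
print(meta.context_window)  # prompt context size the model supports
print(meta.num_output)      # default number of output tokens
print(meta.is_chat_model)   # whether the model exposes a chat interface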