SambaNova¶
Example notebook on how to use Sambaverse and SambaStudio offerings from SambaNova¶
If you're opening this Notebook on colab, you will probably need to install LlamaIndex 🦙.
%pip install llama-index-llms-sambanova
!pip install llama-index
Sambaverse¶
Set up an account on SambaNova
Generate a new API token by clicking on the profile
Identify the model name from the playground
Import Sambaverse and use the LLM
from llama_index.llms.sambanova import Sambaverse
Configure the environment variables
import os
os.environ["SAMBAVERSE_API_KEY"] = "your sambaverse api key"
os.environ["SAMBAVERSE_MODEL_NAME"] = "your sambaverse model name"
# Example model name = Meta/Meta-Llama-3-8B
Create an LLM instance
llm = Sambaverse(
streaming=False,
model_kwargs={
"do_sample": False,
"process_prompt": False,
"select_expert": "Meta-Llama-3-8B",
"stop_sequences": "",
},
)
For more details about the model kwargs, refer here
Completion response
response = llm.complete("What is the capital of India?")
print(response)
Stream complete response
stream_response = llm.stream_complete("What is the capital of India?")
for response in stream_response:
print(response)
SambaStudio¶
Set up an account on SambaNova for SambaStudio
Create a project.
Configure the model name and endpoint.
Import SambaStudio and use the LLM
from llama_index.llms.sambanova import SambaStudio
Configure the environment variables
import os
os.environ["SAMBASTUDIO_API_KEY"] = "your sambastudio api key"
os.environ["SAMBASTUDIO_BASE_URL"] = "your sambastudio base_url"
os.environ["SAMBASTUDIO_BASE_URI"] = "your sambastudio base_uri"
os.environ["SAMBASTUDIO_PROJECT_ID"] = "your sambastudio project_id"
os.environ["SAMBASTUDIO_ENDPOINT_ID"] = "your sambastudio endpoint_id"
Create a SambaStudio instance
llm = SambaStudio(
streaming=False,
model_kwargs={
"do_sample": True,
"process_prompt": True,
"max_tokens_to_generate": 1000,
"temperature": 0.8,
},
)
For more details about the model kwargs, refer here
Complete response
response = llm.complete("What is the capital of India?")
print(response)
Stream complete response
stream_response = llm.stream_complete("What is the capital of India?")
for response in stream_response:
print(response)
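To use the SambaStudio LLM throughout a LlamaIndex application, it can be registered as the default model via the global `Settings` object. This is a configuration sketch: the `model_kwargs` shown are the same as in the instance above, and it assumes the `SAMBASTUDIO_*` environment variables have already been set.

```python
from llama_index.core import Settings
from llama_index.llms.sambanova import SambaStudio

# Create the LLM (reads the SAMBASTUDIO_* environment variables set earlier).
llm = SambaStudio(
    streaming=False,
    model_kwargs={
        "do_sample": True,
        "process_prompt": True,
        "max_tokens_to_generate": 1000,
        "temperature": 0.8,
    },
)

# Make it the default LLM for query engines, chat engines, etc.
Settings.llm = llm
```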