API
LLMSettings
LLM base settings
provider_id: Literal["openai", "gemini"]
Literal["openai", "gemini"]
Description: Third-party provider id.
config: Union[OpenAISettings, GeminiSettings]
Union[OpenAISettings, GeminiSettings]
Description: Third-party provider settings.
prompt_utils: Optional[PromptUtilsSettings]
Optional[PromptUtilsSettings]
Description: Prompt utils settings. Default: None
OpenAISettings
OpenAI settings
api_type: Optional[Literal["open_ai", "azure", "azure_ad"]]
Optional[Literal["open_ai", "azure", "azure_ad"]]
Description: OpenAI API type. Available options are: "open_ai" | "azure" | "azure_ad". If None, defaults to "open_ai".
Default: None
azure_endpoint: Optional[str]
Optional[str]
Description: Azure endpoint, including the resource. See also base_url.
Default: None
azure_deployment: Optional[str]
Optional[str]
Description: Azure deployment. A model deployment; if given, the base client URL is set to include /deployments/{azure_deployment}. Note: this means you won't be able to use non-deployment endpoints. Not supported with the Assistants APIs. See also base_url.
Default: None
api_version: Optional[str]
Optional[str]
Description: API version. Only needed for Azure API.
Default: None
azure_ad_token: Optional[str]
Optional[str]
Description: Azure Active Directory token.
Default: None
model_id: Optional[str]
Optional[str]
Description: OpenAI model id. In Azure, this should be the deployment name.
Default: None
model_type: Literal["completions", "chat.completions", "embeddings", "images.generations"]
Literal["completions", "chat.completions", "embeddings", "images.generations"]
Description: OpenAI model type. Available options are: "chat.completions" | "completions" | "embeddings" | "images.generations".
Default: "chat.completions"
api_key: Optional[str]
Optional[str]
Description: OpenAI API key.
Default: None
organization: Optional[str]
Optional[str]
Description: OpenAI organization.
Default: None
project: Optional[str]
Optional[str]
Description: OpenAI project.
Default: None
base_url: Optional[str]
Optional[str]
Description: OpenAI base url. Note: when using Azure, base_url is set to either "{azure_endpoint}/openai/deployments/{azure_deployment}" or "{azure_endpoint}/openai". Thus, base_url and azure_endpoint are mutually exclusive.
Default: None
llm_parameters: Optional[dict]
Optional[dict]
Description: OpenAI LLM parameters. This is a dictionary of parameters to be passed to the OpenAI LLM API.
Default: None
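A sketch of two OpenAI configurations, one for the standard API and one for Azure. The import path, endpoint, deployment name, and API version are placeholders; the Azure variant illustrates the note above that azure_endpoint is used instead of base_url.

```python
# Illustrative import path; the actual module layout may differ.
from llm_settings import OpenAISettings

# Standard OpenAI API: model_id names the model directly.
openai_config = OpenAISettings(
    model_id="gpt-4o",
    model_type="chat.completions",
    api_key="sk-...",                     # placeholder key
    llm_parameters={"temperature": 0.2},  # forwarded to the OpenAI LLM API
)

# Azure OpenAI: model_id should be the deployment name, and azure_endpoint
# is given instead of base_url (the two are mutually exclusive).
azure_config = OpenAISettings(
    api_type="azure",
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder resource
    azure_deployment="my-gpt-4o-deployment",                 # placeholder deployment
    api_version="2024-02-01",                                # placeholder API version
    api_key="<azure-api-key>",
    model_id="my-gpt-4o-deployment",
    model_type="chat.completions",
)
```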
GeminiSettings
Gemini API settings
model_id: Optional[str]
Optional[str]
Description: Gemini model id.
Default: None
model_type: Literal["completions", "chat.completions"]
Literal["completions", "chat.completions"]
Description: Gemini model type. Available options are: "chat.completions" | "completions".
Default: "chat.completions"
api_key: Optional[str]
Optional[str]
Description: Gemini API key.
Default: None
llm_parameters: Optional[GeminiGenerateContentParameters]
Optional[GeminiGenerateContentParameters]
Description: Gemini LLM parameters. These are the generation parameters passed to the Gemini LLM API.
Default: None
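A sketch of a Gemini configuration. The import path and model name are placeholders, and llm_parameters is shown as a plain dict on the assumption that GeminiGenerateContentParameters accepts dict-style construction.

```python
# Illustrative import path; the actual module layout may differ.
from llm_settings import GeminiSettings

gemini_config = GeminiSettings(
    model_id="gemini-1.5-flash",          # placeholder model id
    model_type="chat.completions",
    api_key="<gemini-api-key>",           # placeholder key
    llm_parameters={"temperature": 0.7},  # assumed dict-compatible with GeminiGenerateContentParameters
)
```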