Bring Your Own LLM
This page provides information on how to configure customer-provided LLMs with Salem.
Salem currently supports OpenAI's ChatGPT. To configure it, you will need an API key for your organization's chosen LLM.
The image below shows Salem's default LLM ActionDefinition. More general information on ActionDefinition parameters and their purpose can be found in the ActionDefinition documentation. See the description below the image for an explanation of how to configure the ActionDefinition for an LLM integration.
The ActionDefinition name must be the same as the name referenced by the corresponding ActionConf, or the request will fail. By default, Salem uses the name "External LLM."
Authentication relies on your organization's default key vault resource, which must contain a secret holding the API key for your LLM. For more information on setting up ActionConf requests, reference the ActionConf documentation.
secret_name
: The name of the secret that contains the LLM's API key.
As of v1.5.2, the LLM configuration expects an "llm_type" of "OpenAI" with the version defined in the static_keys parameter.
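For reference, below is a minimal sketch of what the default ActionDefinition might look like, assuming the field names described on this page (name, secret_name, static_keys, input_keys); the secret name and version values are placeholders, and the exact schema in your deployment may differ:

```json
{
  "name": "External LLM",
  "secret_name": "llm-api-key",
  "static_keys": {
    "llm_type": "OpenAI",
    "version": "gpt-4"
  },
  "input_keys": ["temperature", "additional_instructions"]
}
```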
The input_keys defined in the ActionDefinition must match the keys in the params object of the corresponding ActionConf (see the sketch after this list). In the default ActionConf, the required params are:
temperature
- The sampling temperature passed to the LLM, set in the ActionConf
additional_instructions
- Additional instructions passed to the LLM along with each request, set in the ActionConf
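For example, the params object of the default ActionConf might look like the following sketch; the values are illustrative, and the "definition" field is covered in the troubleshooting notes below:

```json
{
  "params": {
    "definition": "External LLM",
    "temperature": 0.7,
    "additional_instructions": "Keep answers concise and reference the relevant alert fields."
  }
}
```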
If an LLM has not been set up or has been incorrectly configured, an admin user will receive an error message when attempting to use an LLM feature. To troubleshoot:
Make sure the value of "definition" in the params object of the ActionConf matches the name of the ActionDefinition.
For the Ask Salem feature, make sure the ActionConf is named "AskSalem".
Check that the parameters referenced in the ActionConf match the parameters listed in the input_keys of the ActionDefinition.
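Putting these checks together, a correctly paired configuration would look like the following sketch. The names are the defaults discussed above, the values are illustrative, and the "type" labels are only annotations for this example, not part of either schema:

```json
[
  {
    "type": "ActionDefinition",
    "name": "External LLM",
    "input_keys": ["temperature", "additional_instructions"]
  },
  {
    "type": "ActionConf",
    "name": "AskSalem",
    "params": {
      "definition": "External LLM",
      "temperature": 0.7,
      "additional_instructions": "Keep answers concise."
    }
  }
]
```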