"llm" ActionConfs
"llm" actions are created by default to support the LLM features embedded in Salem. Admins can modify the response from an LLM request by editing the specified ActionConf for a Salem feature. To update the ActionConf, the user can access via the Admin menu by entering view -a
in Salem Chat and then selecting Configs > ActionConfs.
[Image: Salem's default LLM ActionConf, showing the "Ask Salem" feature (id: "AskSalem") backed by OpenAI ChatGPT.]
More general information on ActionConf construction and purpose can be found in the Action Conf documentation.
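For orientation, here is a minimal sketch of what that default ActionConf might contain, reconstructed from the field descriptions below. The exact schema, some field names (notably "name" and "disabled"), and the temperature value are assumptions; the image above and the Action Conf documentation are authoritative. The comments are annotations only and would not appear in the stored config.

    {
      "name": "Ask Salem",             // must exactly match the feature name
      "id": "AskSalem",
      "type": "llm",                   // LLM ActionConfs expect "llm"
      "disabled": 1,                   // hypothetical key; 1 = disabled (the default), 0 = enabled
      "inputs": {
        "definition": "External LLM",  // must match the name of an ActionDefinition
        "temperature": 0.7,            // 0 (deterministic) to 1 (creative); value assumed
        "additional_instructions": ""  // appended to the feature's coded instructions; may be empty
      },
      "outputs": {}                    // not in use
    }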
The name of the Salem feature that uses the ActionConf. The ActionConf's name must match this value exactly, or the app will not identify the ActionConf and will return the LLM error card.
The type of ActionConf being defined. LLM ActionConfs expect a value of "llm".
A 1 or 0 value indicating whether the action is currently in operation. By default, the ActionConf is disabled (a value of 1) and must be updated by a Salem admin to enable it.
Required inputs that map to a corresponding ActionDefinition. The "definition" value must match the name of an ActionDefinition, and "temperature" and "additional_instructions" must match the input_keys of that ActionDefinition, or the request will fail. (See the example after this list.)
definition
- The name given in "definition" must match the corresponding LLM's ActionDefinition, or the request will fail (in this case, Salem expects an ActionDefinition named "External LLM").
temperature
- Temperature is a hyperparameter that controls the randomness of a large language model's (LLM's) output. It is a value between 0 and 1: lower temperatures yield more deterministic, factual output, while higher temperatures yield more creative, imaginative output.
additional_instructions
- An editable prompt that is appended to the end of the coded "instructions" for the Salem feature. This field can be an empty string.
outputs
- Not in use.
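As a worked example of the editable inputs described above, an admin who wants more deterministic "Ask Salem" answers plus a standing instruction might edit only the inputs block, along these lines (the values are hypothetical and the field names follow the sketch earlier on this page):

    "inputs": {
      "definition": "External LLM",
      "temperature": 0.2,              // closer to 0 for more factual, repeatable answers
      "additional_instructions": "Answer concisely and link to the relevant Salem documentation."
    }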