
Bring Your Own LLM


Last updated 1 year ago

This page explains how to configure a customer-provided LLM for use with Salem.

Supported LLMs

  • OpenAI - ChatGPT

Pre-requisites

  • An API key for the organization's chosen LLM

Configuring Your LLM for Ask Salem

The image below shows Salem's default LLM ActionDefinition. For general information on ActionDefinition parameters and their purpose, see the documentation.

The sections below the image explain how to configure the ActionDefinition for an LLM integration.

1. LLM ActionDefinition Name

The ActionDefinition name must match the name referenced by the corresponding ActionConf, or the request will fail. By default, Salem uses the name "External LLM."
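As an illustration, the naming relationship might look like the sketch below. Only the "AskSalem" ActionConf name, the "definition" key inside the params object, and the default "External LLM" value come from this guide; the surrounding field layout is an assumption, not Salem's exact schema:

```
{
  "name": "AskSalem",
  "params": {
    "definition": "External LLM"
  }
}
```

The value of "definition" here must be exactly the name of the LLM ActionDefinition, or the request will fail.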

2. Keyvault Authentication

  • secret_name: The name of the key vault secret that contains the LLM's API key value.
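For example, if the organization's key vault stores the LLM API key under a hypothetical secret named "llm-api-key", the ActionDefinition would reference it by name only (the secret value itself never appears in the configuration; the field layout is illustrative):

```
{
  "secret_name": "llm-api-key"
}
```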

3. LLM_type

As of v1.5.2, the LLM configuration expects an "llm_type" of "OpenAI", with the version defined in the static_keys parameter.
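A minimal sketch of how that might appear in the ActionDefinition. Only the "llm_type" value of "OpenAI" and the use of static_keys for the version come from this page; the surrounding structure is an assumption, and the version placeholder should be replaced with your deployment's actual value:

```
{
  "llm_type": "OpenAI",
  "static_keys": {
    "version": "<your-version>"
  }
}
```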

4. Input - Input_keys

The input_keys defined in the ActionDefinition must match the keys in the params object of the corresponding ActionConf. In the default ActionConf, the required params are:

  • temperature - the sampling temperature to pass to the LLM, set in the ActionConf

  • additional_instructions - additional instructions for the LLM, set in the ActionConf
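Putting the two halves side by side, the matching requirement can be sketched as follows. The structure is illustrative (the two objects live in separate configurations, and the example values are hypothetical); the rule is simply that every entry in input_keys has a same-named key under params:

```
{
  "ActionDefinition": {
    "input_keys": ["temperature", "additional_instructions"]
  },
  "ActionConf": {
    "params": {
      "temperature": 0.2,
      "additional_instructions": "Respond concisely."
    }
  }
}
```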

Troubleshooting

If an LLM has not been set up or has been configured incorrectly, an admin user will see an error message when attempting to use an LLM feature.

Tips

  • Make sure the name referenced in the "definition" key of the ActionConf's params object matches the name of the ActionDefinition.

  • For the Ask Salem feature, make sure the ActionConf is named "AskSalem".

  • Check that the parameters referenced in the ActionConf match the parameters in the input_keys of the ActionDefinition.

Authentication relies on your organization's default key vault resource, which must contain a secret holding the API key value for your LLM. For more information on setting up ActionConf requests, reference the documentation.
