LlmComponentContext

Lib~ LlmComponentContext

The Bots LlmComponentContext class provides convenience methods for working with large language models.

An LlmComponentContext instance is passed as an argument to every LLM event handler function.

Constructor

new LlmComponentContext(request)

Description:
  • Constructor of the LLM component context. DO NOT USE - AN INSTANCE IS ALREADY PASSED TO EVENT HANDLERS
Source:
Parameters:
Name Type Description
request object

Extends

Methods

addJSONSchemaFormattingInstruction()

Description:
  • Enriches the system prompt with JSON schema formatting instruction
Source:

addMessage(payload)

Description:
  • Adds a message to the bot response sent to the user.
Source:
Parameters:
Name Type Description
payload object can take a string message, or a message created using the MessageFactory

constructMessagePayload(payload) → {object}

Description:
  • Creates a message payload object
Source:
Overrides:
Parameters:
Name Type Description
payload object can take a string message, a message created by the MessageFactory, or a message created by the deprecated MessageModel.
Returns:
message payload in JSON format
Type
object

convertToJSON(message) → {object|undefined}

Description:
  • Converts the message to a JSON object:
    - it first searches for the first occurrence of an open curly bracket '{' and the last occurrence of a close curly bracket '}'
    - it then tries to parse the text between the open and close curly brackets into a JSON object
    - if parsing succeeds, the JSON object is returned; otherwise the method returns undefined
Source:
Parameters:
Name Type Description
message string the message to convert
Returns:
the parsed message, or undefined
Type
object | undefined
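
The extraction algorithm described above can be sketched as a standalone function (a hypothetical re-implementation for illustration; the real method is available on the context instance):

```javascript
// Hypothetical re-implementation of the convertToJSON extraction algorithm.
function convertToJSON(message) {
  // find the first '{' and the last '}'
  const start = message.indexOf('{');
  const end = message.lastIndexOf('}');
  if (start === -1 || end <= start) {
    return undefined;
  }
  try {
    // parse only the text between (and including) the curly brackets
    return JSON.parse(message.substring(start, end + 1));
  } catch (err) {
    // the text between the brackets is not parseable as JSON
    return undefined;
  }
}
```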

createLLMPromptAction(label, prompt) → {PostbackAction}

Description:
  • Creates a postback action that sends a new prompt to the LLM.

Source:
Parameters:
Name Type Description
label string the label of the postback action button
prompt string the text of the prompt
Returns:
the postback action
Type
PostbackAction

getChannelType() → {string}

Description:
  • Returns the channel conversation type
Source:
Overrides:
Returns:
the channel type
Type
string

getChatHistory() → {Array.<ChatEntry>}

Description:
  • Returns the array of chat messages that are exchanged with the LLM. Each message has the following properties:
    - role: the role under which the message is sent: system, user, or assistant
    - content: the message text
    - turn: a number indicating the refinement turn of the chat messages exchange
Source:
Returns:
the chat messages
Type
Array.<ChatEntry>
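
The ChatEntry shape lends itself to simple filtering. A hypothetical history (the message contents are assumptions for illustration):

```javascript
// Hypothetical chat history shaped like the ChatEntry objects described above.
const chatHistory = [
  { role: 'system', content: 'You are a pizza-ordering assistant.', turn: 1 },
  { role: 'user', content: 'I want a large pepperoni pizza.', turn: 1 },
  { role: 'assistant', content: '{"size": "large", "topping": "pepperoni"}', turn: 1 },
  { role: 'user', content: 'Make it a medium instead.', turn: 2 }
];

// Collect the user refinements from the exchange, for example to log them.
const userRefinements = chatHistory
  .filter(entry => entry.role === 'user')
  .map(entry => entry.content);
```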

getCurrentTurn() → {number}

Description:
  • Returns number indicating the current refinement turn of the chat messages exchange. When the first prompt is sent to the LLM the turn is 1.
Source:
Returns:
the turn
Type
number

getCustomProperty(name) → {object}

Description:
  • Returns the value of a custom property that is stored in the LLM context. A custom property can be used to maintain custom state across event handler calls while interacting with the LLM within the current state in the Visual Flow Designer.
Source:
Parameters:
Name Type Description
name string name of the custom property
Returns:
value of the custom property
Type
object
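
The get/set pattern for custom properties can be sketched with an in-memory stand-in for the context (the property name is an assumption; the real storage lives in the LLM context and survives across event handler calls within the current state):

```javascript
// In-memory stand-in mimicking the getCustomProperty/setCustomProperty contract.
const store = {};
const context = {
  getCustomProperty: name => store[name],
  setCustomProperty: (name, value) => {
    if (value === null) {
      delete store[name]; // setting null removes the property
    } else {
      store[name] = value;
    }
  }
};

// Typical pattern: count how often a handler has run within this state.
const invocationCount = (context.getCustomProperty('invocationCount') || 0) + 1;
context.setCustomProperty('invocationCount', invocationCount);
```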

getInvalidResponseTemplate() → {string}

Description:
  • Returns the template used to send a retry prompt to the LLM when validation errors have been found.
Source:
Returns:
the template
Type
string

getJsonSchema() → {object}

Description:
  • Returns the JSON schema used to validate the LLM response
Source:
Returns:
the json schema
Type
object
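
For illustration, a hypothetical schema of the kind this method returns, together with a naive required-field check (the schema content is an assumption; real validation is performed by the component itself):

```javascript
// A hypothetical JSON schema for a structured LLM answer.
const jsonSchema = {
  type: 'object',
  properties: {
    size: { type: 'string', enum: ['small', 'medium', 'large'] },
    topping: { type: 'string' }
  },
  required: ['size', 'topping']
};

// Naive check that a parsed LLM response contains the required fields.
const response = { size: 'large', topping: 'pepperoni' };
const missingFields = jsonSchema.required.filter(key => !(key in response));
```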

getJsonSchemaInstructionTemplate() → {string}

Description:
  • Returns the template used to enrich the system prompt with instructions to comply with a JSON schema
Source:
Returns:
the template
Type
string

getLastAssistantMessage() → {ChatEntry}

Description:
  • Returns the last response sent by the LLM
Source:
Returns:
the message
Type
ChatEntry

getLastUserMessage() → {ChatEntry}

Description:
  • Returns the last message sent by the user
Source:
Returns:
the message
Type
ChatEntry

getLogger() → {object}

Description:
  • Retrieves the logger object.
Source:
Overrides:
Deprecated:
  • use logger() function instead
Returns:
The logger object.
Type
object

getMaxRetries() → {number}

Description:
  • Returns the maximum number of retry prompts that will be sent to the LLM when the response is invalid
Source:
Returns:
the maximum number
Type
number

getMessageFactory() → {MessageFactory}

Description:
  • Returns the MessageFactory class for creating bots messages
Source:
Overrides:
Returns:
The MessageFactory class
Type
MessageFactory

getMessageModel() → {MessageModel}

Description:
  • Returns the MessageModel class for creating or validating messages to or from bots.
Source:
Overrides:
Deprecated:
  • Use getMessageFactory() instead
See:
  • MessageModel.js
Returns:
The MessageModel class
Type
MessageModel

getRequest() → {object}

Description:
  • Retrieves the request object.
Source:
Overrides:
Returns:
The request object.
Type
object

getResponse() → {object}

Description:
  • Retrieves the response object.
Source:
Overrides:
Returns:
The response object.
Type
object

getResultVariable() → {object}

Description:
  • Gets the value of the LLM result variable

Source:
Returns:
the result value
Type
object

getRetries()

Description:
  • Returns the number of retry prompts that have been sent to the LLM since the last successful LLM response
Source:

getRetryUserMessage() → {string}

Description:
  • Returns the status message that is sent to the user when the LLM is invoked with a retry prompt
Source:
Returns:
the message
Type
string

getSystemPrompt() → {string}

Description:
  • Returns the LLM system prompt
Source:
Returns:
the prompt
Type
string

getUserMessage() → {NonRawMessage}

Description:
  • Returns the last user message.
Source:
Overrides:
Returns:
the last user message. You can cast this message to the appropriate message type.
Type
NonRawMessage

getVariable(name) → {object}

Description:
  • Returns the value of a context or user variable
Source:
Overrides:
Parameters:
Name Type Description
name string name of the variable
Returns:
variable value
Type
object

getVariableDefinition(name)

Description:
  • Gets the definition of a variable
Source:
Overrides:
Parameters:
Name Type Description
name string The name of the variable

handleInvalidResponse(errors) → {false}

Description:
  • Handles an invalid LLM response by sending a retry prompt to the LLM if the maximum number of retries has not been reached yet. If the maximum number of retries is reached, a validation error is set, which results in a transition out of the LLM component using the 'error' transition.

Source:
Parameters:
Name Type Description
errors Array.<string> messages describing what is invalid about the response
Returns:
always returns false
Type
false
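
The retry flow described above can be sketched as follows. This is a hypothetical stand-alone version with the retry bookkeeping made explicit; the real method keeps the retry count and validation error inside the LLM context:

```javascript
// Hypothetical sketch of the handleInvalidResponse retry flow.
function handleInvalidResponse(state, errors) {
  if (state.retries < state.maxRetries) {
    // not at the maximum yet: send one more retry prompt listing the problems
    state.retries += 1;
    state.nextPrompt = `The response is invalid: ${errors.join('; ')}. Please fix it.`;
  } else {
    // maximum reached: set a validation error so the component transitions
    // out through the 'error' transition
    state.validationError = errors.join('; ');
  }
  return false; // mirrors the documented return value
}
```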

isJsonValidationEnabled()

Description:
  • Returns true when JSON formatting is enabled in the associated Invoke LLM state in the Visual Flow Designer
Source:

logger() → {object}

Description:
  • Retrieves the logger object.
Source:
Overrides:
Returns:
The logger object.
Type
object

nlpResult(nlpVariableNameopt) → {NLPResult}

Description:
  • Returns an NLPResult helper object for working with nlpresult variables. See the NLPResult documentation for more information.

    If your skill uses visual flows, you don't need to specify a variable name. If your skill uses a YAML flow, you may specify a particular nlpresult by name (if you have multiple nlpresult variables defined in the flow), or omit the name if you have only one nlpresult variable.

Source:
Overrides:
Parameters:
Name Type Attributes Description
nlpVariableName string <optional>
variable that holds the nlpResult
Returns:
The nlp resolution result.
Type
NLPResult

setCustomProperty(name, value)

Description:
  • Sets the value of a custom property that is stored in the LLM context. A custom property can be used to maintain custom state across event handler calls while interacting with the LLM within the current state in the Visual Flow Designer. If you set the value to null, the custom property is removed.
Source:
Parameters:
Name Type Description
name string name of the custom property
value object value of the custom property

setNextLLMPrompt(prompt, isRetry)

Description:
  • Sets an LLM prompt that will be sent next to the LLM

Source:
Parameters:
Name Type Description
prompt string the text of the prompt
isRetry boolean whether the prompt is a retry prompt that tries to fix a prior invalid LLM response

setResultVariable(result)

Description:
  • Sets the value of the LLM result variable

Source:
Parameters:
Name Type Description
result object the value

setTransitionAction(action)

Description:
  • Sets a transition action. When you use this function, the dialog engine transitions to the state defined for this transition action.

Source:
Parameters:
Name Type Description
action string name of the transition action

setValidationError(errorMessage, errorCode)

Description:
  • Sets the request or LLM response validation error
Source:
Parameters:
Name Type Description
errorMessage string the error message
errorCode string allowable values: 'requestFlagged', 'responseFlagged', 'requestInvalid', 'responseInvalid', 'modelLengthExceeded'

setVariable(name, value)

Description:
  • Sets the value of a context or user variable
Source:
Overrides:
Parameters:
Name Type Description
name string name of the variable
value object value of the variable

translate(rbKey, …rbArgs) → {string}

Description:
  • Get translated string using a resource bundle key defined in the skill.
Source:
Overrides:
Parameters:
Name Type Attributes Description
rbKey string key of the resource bundle entry defined with the skill that should be used to translate
rbArgs string <repeatable>
substitution variables
Returns:
a resource bundle FreeMarker expression that will be resolved when the event handler or custom component response is received by the dialog engine
Type
string

updateLastAssistantMessage(message)

Description:
  • Update the last LLM response message
Source:
Parameters:
Name Type Description
message string the new message

updateSystemPrompt(prompt)

Description:
  • Update the LLM system prompt
Source:
Parameters:
Name Type Description
prompt string the new prompt

variable(name, valueopt)

Description:
  • Read or write variables defined in the current flow. It is not possible to change the type of an existing variable through this method. It is the caller's responsibility to ensure that the value set on a variable is of the correct type (e.g., an entity, a string, or another primitive).

    A new variable can be created. However, since the variable is not defined in the flow, using it in the flow subsequently may be flagged for validation warnings.

    This function takes a variable number of arguments.

    The first form: variable(name); reads the variable called "name", returning its value. The name could be in the form of <scope>.<variableName>. For example, a variable firstName in the profile scope needs to be retrieved as variable("profile.firstName").

    The second form: variable(name, value); writes the value "value" to the variable called "name".
Source:
Overrides:
Example
let firstName = conversation.variable("profile.firstName");
let lastName = conversation.variable("profile.lastName");
conversation.variable("fullName", firstName + ' ' + lastName);
Parameters:
Name Type Attributes Description
name string The name of variable to be set or read
value string <optional>
value to be set for variable