amazon_bedrock

Create an Amazon Bedrock agent with a prompt. Since the text prompt is central to getting great results out of the AI, it is highly recommended that you also read the Prompting Best Practices guide.

Properties

amazon_bedrock
object (required)

An object that accepts the following properties.

amazon_bedrock.global_data
object

A flexible container for arbitrary data, set initially in the SWML script or via the SWML set_global_data action. This data can be referenced globally, and all contained information can be accessed and expanded within the prompt, for example by using a template string.
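As a sketch, a value stored in global_data can be expanded inside the prompt text. Note that the key name and the `${...}` template syntax below are assumptions for illustration, not part of this reference:

```yaml
# Illustrative only: stores a value in global_data and expands it in the prompt.
# The key name and template syntax are assumptions, not guaranteed by this reference.
amazon_bedrock:
  global_data:
    caller_city: Seattle
  prompt:
    text: |
      Greet the caller warmly and mention that you know they are
      calling from ${global_data.caller_city}.
```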

amazon_bedrock.params
object

A JSON object containing parameters as key-value pairs.

amazon_bedrock.post_prompt
object

The final set of instructions and configuration settings to send to the agent. Accepts either a text string or a pom object array for structured prompts, plus optional tuning parameters. See post_prompt details below.

amazon_bedrock.post_prompt_url
string

The URL to which to send status callbacks and reports. Authentication can also be set in the url in the format of username:password@url. See post_prompt_url callback below.
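For example, basic-auth credentials can be embedded directly in the URL (placeholder values shown):

```yaml
# Placeholder credentials and host; replace with your own endpoint.
post_prompt_url: https://myuser:mypass@example.com/post-prompt
```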

amazon_bedrock.prompt
object (required)

Establishes the initial set of instructions and settings to configure the agent.

See prompt for additional details.

amazon_bedrock.SWAIG
object

Defines user-defined functions/endpoints, as JSON objects, that can be executed during the dialogue.

See SWAIG for additional details.

post_prompt

The post_prompt object accepts either a plain text prompt or a structured POM prompt, plus optional tuning parameters.

post_prompt.text
string (required)

The instructions to send to the agent after the conversation has ended, for example a request to summarize the conversation.

post_prompt.temperature
number

Controls the randomness of responses. Higher values (e.g., 0.8) make output more random and creative, while lower values (e.g., 0.2) make it more focused and deterministic. Range: 0.0 to 1.0.

post_prompt.top_p
number

Controls diversity via nucleus sampling. Only tokens with cumulative probability up to top_p are considered. Lower values make output more focused. Range: 0.0 to 1.0.

post_prompt.confidence
number

Minimum confidence threshold for AI responses. Responses below this threshold may be filtered or flagged. Range: 0.0 to 1.0.

post_prompt.presence_penalty
number

Penalizes tokens based on whether they appear in the text so far. Positive values encourage the model to talk about new topics.

post_prompt.frequency_penalty
number

Penalizes tokens based on their frequency in the text so far. Positive values decrease the likelihood of repeating the same line verbatim.
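Putting the tuning parameters together, a post_prompt object might look like the following. The parameter values are illustrative, not recommendations:

```yaml
post_prompt:
  text: Summarize the conversation in two sentences.
  temperature: 0.2       # low randomness for a factual summary
  top_p: 0.9             # nucleus sampling cutoff
  presence_penalty: 0.1  # mild push toward new topics
  frequency_penalty: 0.3 # discourage verbatim repetition
```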

post_prompt_url callback

SignalWire will make a request to the post_prompt_url with the following parameters:

action
string

Action that prompted this request. The value will be “post_conversation”.

ai_end_date
integer

Unix timestamp, in microseconds, indicating when the AI session ended.

ai_session_id
string

A unique identifier for the AI session.

ai_start_date
integer

Unix timestamp, in microseconds, indicating when the AI session started.

app_name
string

Name of the application that originated the request.

call_answer_date
integer

Unix timestamp, in microseconds, indicating when the call was answered.

call_end_date
integer

Unix timestamp, in microseconds, indicating when the call ended.

call_id
string

ID of the call.

call_log
array

The complete log of the call, as an array of message objects with the following fields.

call_log.content
string

Content of the call log entry.

call_log.role
string

Role associated with the call log entry (e.g., “system”, “assistant”, “user”).

call_start_date
integer

Unix timestamp, in microseconds, indicating when the call started.

caller_id_name
string

Name associated with the caller ID.

caller_id_num
string

Number associated with the caller ID.

content_disposition
string

Disposition of the content.

content_type
string

Type of content. The value will be text/swaig.

conversation_id
string

A unique identifier for the conversation thread, if configured via the AI parameters.

post_prompt_data
object

The answer from the AI agent to the post_prompt. The object contains the following three fields.

post_prompt_data.parsed
object

If a JSON object is detected within the answer, it is parsed and provided here.

post_prompt_data.raw
string

The raw data answer from the AI agent.

post_prompt_data.substituted
string

The answer from the AI agent, excluding any JSON.
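To illustrate how the three fields relate (with contrived values): if the agent's raw answer embeds a JSON object, that object appears under parsed and is stripped from substituted:

```json
{
  "post_prompt_data": {
    "raw": "The caller asked about billing. {\"topic\": \"billing\"}",
    "parsed": [
      {
        "topic": "billing"
      }
    ],
    "substituted": "The caller asked about billing. "
  }
}
```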

project_id
string

ID of the Project.

space_id
string

ID of the Space.

SWMLVars
object

A collection of variables related to SWML.

swaig_log
array

An array of log entries, one per SWAIG function call made during the session.

total_input_tokens
integer

Represents the total number of input tokens.

total_output_tokens
integer

Represents the total number of output tokens.

version
string

Version number.

Post prompt callback request example

Below is a JSON example of the callback request that is sent to the post_prompt_url:

```json
{
  "total_output_tokens": 119,
  "caller_id_name": "[CALLER_NAME]",
  "SWMLVars": {
    "ai_result": "success",
    "answer_result": "success"
  },
  "call_start_date": 1694541295773508,
  "project_id": "[PROJECT_ID]",
  "call_log": [
    {
      "content": "[AI INITIAL PROMPT/INSTRUCTIONS]",
      "role": "system"
    },
    {
      "content": "[AI RESPONSE]",
      "role": "assistant"
    },
    {
      "content": "[USER RESPONSE]",
      "role": "user"
    }
  ],
  "ai_start_date": 1694541297950440,
  "call_answer_date": 1694541296799504,
  "version": "2.0",
  "content_disposition": "Conversation Log",
  "conversation_id": "[CONVERSATION_ID]",
  "space_id": "[SPACE_ID]",
  "app_name": "swml app",
  "swaig_log": [
    {
      "post_data": {
        "content_disposition": "SWAIG Function",
        "conversation_id": "[CONVERSATION_ID]",
        "space_id": "[SPACE_ID]",
        "meta_data_token": "[META_DATA_TOKEN]",
        "app_name": "swml app",
        "meta_data": {},
        "argument": {
          "raw": "{\n \"target\": \"[TRANSFER_TARGET]\"\n}",
          "substituted": "",
          "parsed": [
            {
              "target": "[TRANSFER_TARGET]"
            }
          ]
        },
        "call_id": "[CALL_ID]",
        "content_type": "text/swaig",
        "ai_session_id": "[AI_SESSION_ID]",
        "caller_id_num": "[CALLER_NUMBER]",
        "caller_id_name": "[CALLER_NAME]",
        "project_id": "[PROJECT_ID]",
        "purpose": "Use to transfer to a target",
        "argument_desc": {
          "type": "object",
          "properties": {
            "target": {
              "description": "the target to transfer to",
              "type": "string"
            }
          }
        },
        "function": "transfer",
        "version": "2.0"
      },
      "command_name": "transfer",
      "epoch_time": 1694541334,
      "command_arg": "{\n \"target\": \"[TRANSFER_TARGET]\"\n}",
      "url": "https://example.com/here",
      "post_response": {
        "action": [
          {
            "say": "This is a say message!"
          },
          {
            "SWML": {
              "sections": {
                "main": [
                  {
                    "connect": {
                      "to": "+1XXXXXXXXXX"
                    }
                  }
                ]
              },
              "version": "1.0.0"
            }
          },
          {
            "stop": true
          }
        ],
        "response": "transferred to [TRANSFER_TARGET], the call has ended"
      }
    }
  ],
  "total_input_tokens": 5627,
  "caller_id_num": "[CALLER_NUMBER]",
  "call_id": "[CALL_ID]",
  "call_end_date": 1694541335435503,
  "content_type": "text/swaig",
  "action": "post_conversation",
  "post_prompt_data": {
    "substituted": "[SUMMARY_MESSAGE_PLACEHOLDER]",
    "parsed": [],
    "raw": "[SUMMARY_MESSAGE_PLACEHOLDER]"
  },
  "ai_end_date": 1694541335425164,
  "ai_session_id": "[AI_SESSION_ID]"
}
```

Responding to post prompt requests

The response to the callback request should be a JSON object with the following parameters:

```json
{
  "response": "ok"
}
```

Amazon Bedrock example

The following example selects Bedrock’s Tiffany voice using the voice_id parameter in the prompt. It includes scaffolding for a post_prompt_url as well as several remote and inline functions using SWAIG.

```yaml
---
version: 1.0.0
sections:
  main:
    - amazon_bedrock:
        post_prompt_url: https://example.com/my-api
        prompt:
          voice_id: tiffany
          text: |
            You are a helpful assistant that can provide information to users about a destination.
            At the start of the conversation, always ask the user for their name.
            You can use the appropriate function to get the phone number, address,
            or weather information.
        post_prompt:
          text: Summarize the conversation.
        SWAIG:
          includes:
            - functions:
                - get_phone_number
                - get_address
              url: https://example.com/functions
          defaults:
            web_hook_url: https://example.com/my-webhook
          functions:
            - function: get_weather
              description: To determine what the current weather is in a provided location.
              parameters:
                type: object
                properties:
                  location:
                    type: string
                    description: The name of the city to find the weather from.
            - function: summarize_conversation
              description: Summarize the conversation.
              parameters:
                type: object
                properties:
                  name:
                    type: string
                    description: The name of the user.
```