Agents

BedrockAgent


BedrockAgent extends AgentBase to use Amazon Bedrock’s voice-to-voice model as the AI backend. It generates SWML with the amazon_bedrock verb instead of ai, while maintaining full compatibility with all standard agent features: prompts (text and POM), skills, SWAIG functions, post-prompt, and dynamic configuration.

Extends AgentBase — inherits all parent properties and methods.

See the SWML bedrock reference for the full specification of the amazon_bedrock verb.
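As a rough illustration, the SWML rendered by a BedrockAgent has the shape sketched below. Field names and nesting here are illustrative only; the SWML bedrock reference is authoritative.

```json
{
  "version": "1.0.0",
  "sections": {
    "main": [
      {
        "amazon_bedrock": {
          "prompt": { "text": "You are a helpful assistant." },
          "params": { "voice_id": "matthew", "temperature": 0.7, "top_p": 0.9 }
        }
      }
    ]
  }
}
```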

Properties

name
str, defaults to "bedrock_agent"

Agent name.

route
str, defaults to "/bedrock"

HTTP route for the agent endpoint.

system_prompt
str

Initial system prompt. Can be overridden later with set_prompt_text().

voice_id
str, defaults to "matthew"

Bedrock voice identifier (e.g., "matthew", "joanna").

temperature
float, defaults to 0.7

Generation temperature. Range: 0 to 1.

top_p
float, defaults to 0.9

Nucleus sampling parameter. Range: 0 to 1.

max_tokens
int, defaults to 1024

Maximum tokens to generate per response.

**kwargs
Any

Additional arguments passed to the AgentBase constructor (e.g., host, port).

Methods

Overridden Behavior

BedrockAgent overrides several AgentBase methods to adapt for the Bedrock voice-to-voice model:

set_llm_model(): Logs a warning and does nothing. Bedrock uses a fixed voice-to-voice model.

set_llm_temperature(): Redirects to set_inference_params(temperature=...).

set_prompt_llm_params(): Logs a warning. Use set_inference_params() instead.

set_post_prompt_llm_params(): Logs a warning. Bedrock post-prompt uses OpenAI as configured in the platform.

Parameters specific to text-based LLMs (barge_confidence, presence_penalty, frequency_penalty) are automatically filtered out during SWML rendering and have no effect on Bedrock agents.
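The filtering described above can be sketched as follows. This is an illustrative stand-in, not the SDK's actual implementation; TEXT_LLM_ONLY_PARAMS and filter_bedrock_params are hypothetical names.

```python
# Parameters that only make sense for text-based LLMs; Bedrock's
# voice-to-voice model ignores them, so they are dropped before rendering.
TEXT_LLM_ONLY_PARAMS = {"barge_confidence", "presence_penalty", "frequency_penalty"}

def filter_bedrock_params(params: dict) -> dict:
    """Drop text-LLM-only parameters before rendering the amazon_bedrock verb."""
    return {k: v for k, v in params.items() if k not in TEXT_LLM_ONLY_PARAMS}

print(filter_bedrock_params({"temperature": 0.3, "top_p": 0.95, "presence_penalty": 0.5}))
# {'temperature': 0.3, 'top_p': 0.95}
```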

Prompt methods (set_prompt_text(), set_prompt_pom(), prompt_add_section(), etc.) work normally. The prompt structure is built the same way as AgentBase and then included in the amazon_bedrock verb.


Examples

Basic Bedrock agent with a tool

from signalwire import BedrockAgent, FunctionResult

agent = BedrockAgent(
    name="bedrock-assistant",
    route="/assistant",
    system_prompt="You are a helpful customer service representative.",
    voice_id="joanna",
    temperature=0.5
)

# Use all standard AgentBase features
agent.prompt_add_section("Guidelines", "Be concise and professional.")
agent.add_language("English", "en-US", "rime.spore")

@agent.tool(
    description="Look up order status",
    parameters={
        "type": "object",
        "properties": {
            "order_id": {"type": "string", "description": "Order ID"}
        },
        "required": ["order_id"]
    }
)
def check_order(args, raw_data):
    order_id = args.get("order_id", "")
    return FunctionResult(f"Order {order_id} is shipped and arriving tomorrow.")

# Adjust inference parameters at runtime
agent.set_voice("matthew")
agent.set_inference_params(temperature=0.3, top_p=0.95)

if __name__ == "__main__":
    agent.run()

Multi-agent server with Bedrock

from signalwire import AgentBase, AgentServer, BedrockAgent

# Standard OpenAI-backed agent
standard_agent = AgentBase(name="standard", route="/standard")
standard_agent.set_prompt_text("You are a general assistant.")

# Bedrock voice-to-voice agent
bedrock_agent = BedrockAgent(
    name="bedrock",
    route="/bedrock",
    system_prompt="You are a voice-optimized assistant.",
    voice_id="matthew"
)

server = AgentServer()
server.register(standard_agent)
server.register(bedrock_agent)

if __name__ == "__main__":
    server.run()