---

title: BedrockAgent
slug: /reference/python/agents/bedrock-agent
description: Amazon Bedrock voice-to-voice agent extending AgentBase.
max-toc-depth: 3
---

[agentbase]: /docs/server-sdks/reference/python/agents/agent-base

[amazon-bedrock]: /docs/swml/reference/amazon-bedrock

[swml-bedrock-reference]: /docs/swml/reference/amazon-bedrock

BedrockAgent extends [`AgentBase`][agentbase]
to use Amazon Bedrock's voice-to-voice model as the AI backend. It generates SWML
with the `amazon_bedrock` verb instead of `ai`, while maintaining full
compatibility with all standard agent features: prompts (text and POM), skills,
SWAIG functions, post-prompt, and dynamic configuration.

Extends [`AgentBase`][agentbase] and inherits
all parent properties and methods.

<Info>
  BedrockAgent generates SWML with the [`amazon_bedrock`][amazon-bedrock] verb
  instead of `ai`. See the [SWML bedrock reference][swml-bedrock-reference] for the
  full specification.
</Info>

## **Properties**

<ParamField path="name" type="str" default="bedrock_agent" toc={true}>
  Agent name.
</ParamField>

<ParamField path="route" type="str" default="/bedrock" toc={true}>
  HTTP route for the agent endpoint.
</ParamField>

<ParamField path="system_prompt" type="str" toc={true}>
  Initial system prompt. Can be overridden later with `set_prompt_text()`.
</ParamField>

<ParamField path="voice_id" type="str" default="matthew" toc={true}>
  Bedrock voice identifier (e.g., `"matthew"`, `"joanna"`).
</ParamField>

<ParamField path="temperature" type="float" default="0.7" toc={true}>
  Generation temperature. Range: 0 to 1.
</ParamField>

<ParamField path="top_p" type="float" default="0.9" toc={true}>
  Nucleus sampling parameter. Range: 0 to 1.
</ParamField>

<ParamField path="max_tokens" type="int" default="1024" toc={true}>
  Maximum tokens to generate per response.
</ParamField>

<ParamField path="**kwargs" type="Any" toc={true}>
  Additional arguments passed to the AgentBase constructor (e.g., `host`,
  `port`).
</ParamField>
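
The parameters above map directly onto the constructor. A minimal sketch of the keyword arguments (values other than the documented defaults are illustrative):

```python
# Keyword arguments for BedrockAgent; documented defaults noted in comments.
bedrock_kwargs = {
    "name": "support-bot",          # default: "bedrock_agent"
    "route": "/support",            # default: "/bedrock"
    "system_prompt": "You are a support representative.",
    "voice_id": "joanna",           # default: "matthew"
    "temperature": 0.4,             # default: 0.7, range 0 to 1
    "top_p": 0.9,                   # default: 0.9, range 0 to 1
    "max_tokens": 1024,             # default: 1024
    # Extra kwargs pass through to the AgentBase constructor:
    "host": "0.0.0.0",
    "port": 3000,
}
# agent = BedrockAgent(**bedrock_kwargs)
```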

## **Methods**

<CardGroup cols={2}>
  <Card title="set_voice" href="/docs/server-sdks/reference/python/agents/bedrock-agent/set-voice">
    Set the Bedrock voice ID after construction.
  </Card>

  <Card title="set_inference_params" href="/docs/server-sdks/reference/python/agents/bedrock-agent/set-inference-params">
    Update Bedrock inference parameters.
  </Card>
</CardGroup>

## **Overridden Behavior**

BedrockAgent overrides several AgentBase methods to adapt for the Bedrock
voice-to-voice model:

| Method                         | Behavior                                                                    |
| ------------------------------ | --------------------------------------------------------------------------- |
| `set_llm_model()`              | Logs a warning and does nothing. Bedrock uses a fixed voice-to-voice model. |
| `set_llm_temperature()`        | Redirects to `set_inference_params(temperature=...)`.                       |
| `set_prompt_llm_params()`      | Logs a warning. Use `set_inference_params()` instead.                       |
| `set_post_prompt_llm_params()` | Logs a warning. Post-prompt processing uses an OpenAI model configured by the platform. |

<Warning>
  Parameters specific to text-based LLMs (`barge_confidence`, `presence_penalty`,
  `frequency_penalty`) are automatically filtered out during SWML rendering and
  have no effect on Bedrock agents.
</Warning>
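
The effect of that filtering can be pictured with a small standalone sketch (illustrative only; this is not the SDK's internal implementation):

```python
# Illustrative sketch: parameters meaningful only to text-based LLMs
# never reach the rendered amazon_bedrock verb.
TEXT_LLM_ONLY = {"barge_confidence", "presence_penalty", "frequency_penalty"}

def filter_bedrock_params(params: dict) -> dict:
    """Drop parameters that have no effect on Bedrock agents."""
    return {k: v for k, v in params.items() if k not in TEXT_LLM_ONLY}

params = {"temperature": 0.3, "top_p": 0.95, "presence_penalty": 0.5}
print(filter_bedrock_params(params))  # {'temperature': 0.3, 'top_p': 0.95}
```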

Prompt methods (`set_prompt_text()`, `set_prompt_pom()`, `prompt_add_section()`,
etc.) work normally. The prompt structure is built the same way as AgentBase
and then included in the `amazon_bedrock` verb.
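
Pictured as plain data, sections added with `prompt_add_section()` accumulate into a POM-style list that is rendered into the verb. The field names below are an orientation sketch only; the SDK's actual Prompt Object Model schema governs:

```python
# Illustrative POM-style prompt data; field names are a sketch, not the
# SDK's authoritative schema.
prompt_sections = [
    {"title": "Role", "body": "You are a helpful customer service representative."},
    {"title": "Guidelines", "body": "Be concise and professional."},
]

def render_prompt(sections):
    """Flatten POM-style sections into a single prompt string (sketch)."""
    return "\n\n".join(f"## {s['title']}\n{s['body']}" for s in sections)

print(render_prompt(prompt_sections))
```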

***

## **Examples**

### Basic Bedrock agent with a tool

```python {3}
from signalwire import BedrockAgent, FunctionResult

agent = BedrockAgent(
    name="bedrock-assistant",
    route="/assistant",
    system_prompt="You are a helpful customer service representative.",
    voice_id="joanna",
    temperature=0.5
)

# Use all standard AgentBase features
agent.prompt_add_section("Guidelines", "Be concise and professional.")
agent.add_language("English", "en-US", "rime.spore")

@agent.tool(
    description="Look up order status",
    parameters={
        "type": "object",
        "properties": {
            "order_id": {"type": "string", "description": "Order ID"}
        },
        "required": ["order_id"]
    }
)
def check_order(args, raw_data):
    order_id = args.get("order_id", "")
    return FunctionResult(f"Order {order_id} is shipped and arriving tomorrow.")

# Adjust inference parameters at runtime
agent.set_voice("matthew")
agent.set_inference_params(temperature=0.3, top_p=0.95)

if __name__ == "__main__":
    agent.run()
```

### Multi-agent server with Bedrock

```python {8}
from signalwire import AgentBase, AgentServer, BedrockAgent

# Standard OpenAI-backed agent
standard_agent = AgentBase(name="standard", route="/standard")
standard_agent.set_prompt_text("You are a general assistant.")

# Bedrock voice-to-voice agent
bedrock_agent = BedrockAgent(
    name="bedrock",
    route="/bedrock",
    system_prompt="You are a voice-optimized assistant.",
    voice_id="matthew"
)

server = AgentServer()
server.register(standard_agent)
server.register(bedrock_agent)

if __name__ == "__main__":
    server.run()
```