---

title: amazon_bedrock
slug: /reference/python/relay/call/amazon-bedrock
description: Connect a call to an Amazon Bedrock AI agent.
max-toc-depth: 3
---


[ai]: /docs/server-sdks/reference/python/relay/call/ai

Connects the call to an Amazon Bedrock AI agent. This method behaves like
[`ai()`][ai], but uses Amazon Bedrock as the LLM backend.

## **Parameters**

<ParamField path="prompt" type="Optional[Any]" toc={true}>
  The prompt configuration for the Bedrock agent.
</ParamField>

<ParamField path="SWAIG" type="Optional[dict]" toc={true}>
  SWAIG configuration defining the tools/functions the agent may invoke during the conversation.
</ParamField>

<ParamField path="ai_params" type="Optional[dict]" toc={true}>
  Runtime AI parameters for the Bedrock session, such as `barge_confidence`.
</ParamField>

<ParamField path="global_data" type="Optional[dict]" toc={true}>
  Data accessible to the AI and SWAIG functions.
</ParamField>

<ParamField path="post_prompt" type="Optional[dict]" toc={true}>
  Configuration for a prompt executed after the conversation ends, typically used to summarize the call.
</ParamField>

<ParamField path="post_prompt_url" type="Optional[str]" toc={true}>
  URL to which the post-prompt result is delivered.
</ParamField>
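To illustrate how the `SWAIG` and `global_data` parameters fit together, here is a minimal sketch of the dictionaries you might pass. The field names follow SignalWire's general SWAIG conventions for AI agents; the `get_weather` function and the `caller_tier` key are hypothetical and not part of this API:

```python
# Illustrative SWAIG configuration (a sketch; field names follow the
# general SignalWire SWAIG conventions and are assumptions, not
# confirmed specifically for amazon_bedrock).
swaig = {
    "functions": [
        {
            # Hypothetical tool the agent can call by name.
            "function": "get_weather",
            "description": "Look up the current weather for a city.",
            # JSON Schema describing the function's arguments.
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        }
    ]
}

# Hypothetical data made available to the AI and its SWAIG functions.
global_data = {"caller_tier": "gold"}
```

These dictionaries would then be passed as the `SWAIG` and `global_data` arguments of `call.amazon_bedrock(...)`.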

## **Returns**

`dict` -- Server response confirming that the Bedrock session was started.

## **Example**

```python {15}
from signalwire.relay import RelayClient

client = RelayClient(
    project="your-project-id",
    token="your-api-token",
    host="your-space.signalwire.com",
    contexts=["default"],
)

@client.on_call
async def handle_call(call):
    await call.answer()

    # Start an Amazon Bedrock AI agent
    result = await call.amazon_bedrock(
        prompt={"text": "You are a helpful assistant."},
        ai_params={"barge_confidence": 0.02},
    )

client.run()
```