---
id: a1f9b9bc-2467-4d55-af2a-3d8fe9de2c51
title: live_translate
slug: /reference/live-translate
description: Translate a voice interaction in real-time.
max-toc-depth: 3
---
Start live translation of the call. The translation will be sent to the specified webhook URL.
## **Properties**
An object that accepts the following properties.

- `action`: The action to perform. See [actions](#actions) below.
## **Actions**
The `action` property controls the translation session. Use `start` to begin translating with configuration options, `stop` to end an active session, `summarize` to request an on-demand AI summary mid-session, or `inject` to insert a translated message into the conversation.
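The key structural difference between the actions: `start`, `summarize`, and `inject` take an object as the value of `action`, while `stop` is a bare string. A minimal sketch of the two shapes, using only parameters shown in the examples below:

```yaml
# Object-valued action: configuration goes inside the action object
- live_translate:
    action:
      start:
        webhook: 'https://example.com/webhook'
        from_lang: en-US
        to_lang: es-ES

# String-valued action: no configuration
- live_translate:
    action: stop
```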
### `start`

Start a live translation session. The `start` object accepts the following parameters:

- `webhook`: The URL to receive translation events via HTTP POST. When `live_events` is enabled, partial results are sent as they occur. When `ai_summary` is enabled, summaries in both languages are sent when the session ends. Authentication can also be set in the URL in the format `username:password@url`.
- `from_lang`: The language to translate from. See [supported voices & languages](/docs/platform/voice/tts).
- `to_lang`: The language to translate to. See [supported voices & languages](/docs/platform/voice/tts).
- `from_voice`: The TTS voice to use for the source language. See [supported voices & languages](/docs/platform/voice/tts).
- `to_voice`: The TTS voice to use for the target language. See [supported voices & languages](/docs/platform/voice/tts).
- `from_filter`: Translation filter to apply to the source language direction. Adjusts the tone or style of translated speech. Preset values: `polite` (removes insults, maintains sentiment), `rude` (adds insults, maintains sentiment), `professional` (removes slang), `shakespeare` (iambic pentameter), `gen-z` (Gen-Z slang and expressions). For custom filters, use the `prompt:` prefix (e.g., `prompt:Use formal business language`).
- `to_filter`: Translation filter to apply to the target language direction. Adjusts the tone or style of translated speech. Accepts the same preset values as `from_filter`, and custom filters with the `prompt:` prefix.
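As a sketch of combining a preset filter with a custom one (assuming the parameters are named `from_filter` and `to_filter` as above; only `webhook`, `from_lang`, and `to_lang` are confirmed by the examples in this document):

```yaml
- live_translate:
    action:
      start:
        webhook: 'https://example.com/webhook'
        from_lang: en-US
        to_lang: es-ES
        # Preset filter: strip slang from the source-language speech
        from_filter: professional
        # Custom filter via the prompt: prefix
        to_filter: 'prompt:Use formal business language'
```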
- `live_events`: Whether to enable live events.
- `ai_summary`: Whether to enable automatic AI summarization. When enabled, AI-generated summaries in both languages are sent to your webhook when the translation session ends.
- `speech_timeout`: The timeout for speech recognition in milliseconds. Minimum value: `1500`.
- `vad_silence_ms`: Voice activity detection silence time in milliseconds. Default depends on the speech engine: `300` for Deepgram, `500` for Google. Minimum value: `1`.
- `vad_thresh`: Voice activity detection threshold. Range: `0` to `1800`.
- `debug_level`: Debug level for logging.
- `direction`: An array of call directions to translate. Possible values: `remote-caller`, `local-caller`.
- `speech_engine`: The speech recognition engine to use. Possible values: `deepgram`, `google`.
- `summary_prompt`: The AI prompt that instructs how to summarize the conversation when `ai_summary` is enabled. This prompt is sent to an AI model to guide how it generates the summary.
### `stop`

Set `action` to the string `"stop"` to stop the live translation session.

This action is designed for use on active calls that have an existing translation session running. You can send it via the [Call Commands REST API](/docs/apis/calling/calls/call-commands), or include it in a SWML section executed via [`transfer`](/docs/swml/reference/transfer) or [`execute`](/docs/swml/reference/execute) during a call.

If you want automatic summarization when the session ends instead, set `ai_summary: true` in the `start` action.
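For example, to have a summary generated automatically at session end rather than requesting one on demand, enable it when starting (a sketch; `ai_summary` and `summary_prompt` are the parameter names assumed in the list above):

```yaml
- live_translate:
    action:
      start:
        webhook: 'https://example.com/webhook'
        from_lang: en-US
        to_lang: es-ES
        # Summaries in both languages are posted to the webhook at session end
        ai_summary: true
        summary_prompt: List the action items discussed in this call.
```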
### `summarize`

Request an on-demand AI summary of the conversation while translation is still active. Like `stop`, this action requires an active call with a running translation session, and can be sent via the [Call Commands REST API](/docs/apis/calling/calls/call-commands) or a SWML section executed via [`transfer`](/docs/swml/reference/transfer) or [`execute`](/docs/swml/reference/execute).

If called with an empty object, the default summarization prompt and webhook are used. The `summarize` object accepts the following parameters:
- `webhook`: The webhook URI to be called. Authentication can also be set in the URL in the format `username:password@url`.
- `prompt`: The AI prompt that instructs the AI model how to summarize the conversation. This guides the style and content of the generated summary.
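Because both parameters are optional, the simplest form passes an empty object to fall back on the session's default prompt and webhook:

```yaml
- live_translate:
    action:
      summarize: {}
```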
### `inject`

Inject a message into the conversation to be translated and spoken to the specified party. Like `stop` and `summarize`, this action requires an active call with a running translation session, and can be sent via the [Call Commands REST API](/docs/apis/calling/calls/call-commands) or a SWML section executed via [`transfer`](/docs/swml/reference/transfer) or [`execute`](/docs/swml/reference/execute).

The `inject` object accepts the following parameters:

- `message`: The message to be injected.
- `direction`: The direction of the message. Possible values: `remote-caller`, `local-caller`.
## **Action usage context**
| Action | Call start | Live call |
| ----------- | ------------------------- | -------------------- |
| `start` | ✅ Primary use | ✅ Can start mid-call |
| `stop` | ❌ No session to stop | ✅ Designed for this |
| `summarize` | ❌ No content to summarize | ✅ Designed for this |
| `inject` | ❌ No session exists | ✅ Designed for this |
**Call start:** The initial SWML document returned when a call first arrives.
**Live call:** Actions sent to active calls via the [Call Commands REST API](/docs/apis/calling/calls/call-commands) or SWML sections executed via [`transfer`](/docs/swml/reference/transfer) or [`execute`](/docs/swml/reference/execute) during a call.
* **`ai_summary: true`** (in `start`): Automatically generates summary when session **ends**
* **`summarize` action**: On-demand summary **during** an active session
## **Examples**
Start live translation when the call is answered:

```yaml
version: 1.0.0
sections:
  main:
    - answer: {}
    - live_translate:
        action:
          start:
            webhook: 'https://example.com/webhook'
            from_lang: en-US
            to_lang: es-ES
            from_voice: elevenlabs.josh
            to_voice: elevenlabs.josh
            live_events: true
            direction:
              - remote-caller
              - local-caller
            speech_engine: deepgram
```
```json
{
  "version": "1.0.0",
  "sections": {
    "main": [
      { "answer": {} },
      {
        "live_translate": {
          "action": {
            "start": {
              "webhook": "https://example.com/webhook",
              "from_lang": "en-US",
              "to_lang": "es-ES",
              "from_voice": "elevenlabs.josh",
              "to_voice": "elevenlabs.josh",
              "live_events": true,
              "direction": ["remote-caller", "local-caller"],
              "speech_engine": "deepgram"
            }
          }
        }
      }
    ]
  }
}
```
Stop an active translation session:

```yaml
live_translate:
  action: stop
```
```json
{
  "live_translate": {
    "action": "stop"
  }
}
```
Request an on-demand summary with a custom prompt:

```yaml
live_translate:
  action:
    summarize:
      webhook: 'https://example.com/webhook'
      prompt: Summarize the key points of this conversation.
```
```json
{
  "live_translate": {
    "action": {
      "summarize": {
        "webhook": "https://example.com/webhook",
        "prompt": "Summarize the key points of this conversation."
      }
    }
  }
}
```
Inject a translated message toward the remote caller:

```yaml
live_translate:
  action:
    inject:
      message: This is an injected message.
      direction: remote-caller
```
```json
{
  "live_translate": {
    "action": {
      "inject": {
        "message": "This is an injected message.",
        "direction": "remote-caller"
      }
    }
  }
}
```