---
id: 50bbedaf-3e1c-44f0-9021-4f54de00029f
hide_title: false
slug: /reference/ai
title: ai
description: Create an AI agent to interact with users.
max-toc-depth: 3
---

[languages]: /docs/swml/reference/ai/languages
[params]: /docs/swml/reference/ai/params
[prompt]: /docs/swml/reference/ai/prompt
[SWAIG]: /docs/swml/reference/ai/swaig
[set_global_data action]: /docs/swml/reference/ai/swaig/functions/data-map#list-of-valid-actions
[Prompting Best Practices]: /docs/platform/ai/best-practices#crafting-the-initial-prompt-for-the-ai
[markdown-guide]: https://www.markdownguide.org/

Creates an AI agent that conducts voice conversations using automatic speech recognition (ASR), large language models (LLMs), and text-to-speech (TTS) synthesis. The agent processes caller speech in real time, generates contextually appropriate responses, and can execute custom functions to interact with external systems and databases through the [SignalWire AI Gateway (SWAIG)][SWAIG].

Because the [prompt] configuration is central to AI agent behavior, we recommend reading the [Prompting Best Practices][Prompting Best Practices] guide.

## **Properties**

An object that defines an AI agent for conducting voice conversations. Accepts the following properties to configure the agent's prompt, behavior, functions, language support, and other settings.

### `prompt`

Defines the AI agent's personality, goals, behaviors, and instructions for handling conversations. The prompt establishes how the agent should interact with callers, what information it should gather, and how it should respond to various scenarios.

We recommend writing [prompts][prompt] in [Markdown][markdown-guide], since LLMs understand structured content better, and reading the [Prompting Best Practices][Prompting Best Practices] guide.

### `global_data`

A key-value object for storing data that persists throughout the AI session.
It can be set initially in the SWML script or modified during the conversation using the [`set_global_data`][set_global_data action] action. The `global_data` object is accessible everywhere in the AI session: in prompts, in AI parameters, and in SWML returned from SWAIG functions. Access its properties using template strings (e.g., `${global_data.property_name}`).

### `hints`

An array of strings and/or objects that guide the AI's pronunciation and understanding of specific words or phrases. Add commonly mispronounced words as hints to help the AI speak more accurately.

**Hints as strings:** Each string in the array gives the AI context on how to interpret certain words. For example, if a user says `Toni` and the hint is `Tony`, the AI understands that the user said `Tony`.

**Hints as objects:** An array of objects with the properties below to customize how the AI handles specific words:

- `hint`: The hint to match. This matches the string exactly as provided.
- `pattern`: A regular expression to match the hint against. This ensures that the hint has a valid matching pattern before being replaced.
- `replace`: The text to replace the hint with. This replaces the portion of the hint that matches the pattern.
- `ignore_case`: If `true`, the hint is matched in a case-insensitive manner. Defaults to `false`.

### `languages`

An array of JSON objects defining the languages supported in the conversation. See [languages] for more details.

### `params`

A JSON object containing parameters as key-value pairs. See [params] for more details.

### `post_prompt`

The final set of instructions and configuration settings to send to the agent. Accepts the following fields:

- `text`: The instructions to send to the agent.
- `temperature`: Randomness setting. Float value between 0.0 and 1.5. Values closer to 0 make the output less random.
- `top_p`: Randomness setting, as an alternative to `temperature`. Float value between 0.0 and 1.0. Values closer to 0 make the output less random.
- `confidence`: Threshold to fire a speech-detect event at the end of the utterance. Float value between 0.0 and 1.0.
  Decreasing this value reduces the pause after the user speaks, but may introduce false positives.
- `presence_penalty`: Aversion to staying on topic. Float value between -2.0 and 2.0. Positive values increase the model's likelihood to talk about new topics.
- `frequency_penalty`: Aversion to repeating lines. Float value between -2.0 and 2.0. Positive values decrease the model's likelihood to repeat the same line verbatim.

### `post_prompt_url`

The URL to which status callbacks and reports are sent. Authentication can also be set in the URL, in the format `username:password@url`. See the [post\_prompt\_url callback](#post_prompt_url-callback) below.

### `pronounce`

An array of objects that clarify the AI's pronunciation of certain words or expressions. Each object accepts the following properties:

- `replace`: The expression to replace.
- `with`: The phonetic spelling of the expression.
- `ignore_case`: Whether the pronunciation replacement should ignore case.

### `SWAIG`

An array of JSON objects that create user-defined functions/endpoints that can be executed during the dialogue. See [SWAIG] for more details.

## post\_prompt\_url callback

SignalWire makes a request to the `post_prompt_url` with the following parameters:

- `action`: Action that prompted this request. The value will be `post_conversation`.
- `ai_end_date`: Timestamp indicating when the AI session ended.
- `ai_session_id`: A unique identifier for the AI session.
- `ai_start_date`: Timestamp indicating when the AI session started.
- `app_name`: Name of the application that originated the request.
- `call_answer_date`: Timestamp indicating when the call was answered.
- `call_end_date`: Timestamp indicating when the call ended.
- `call_id`: ID of the call.
- `call_log`: The complete log of the call, as an array of JSON objects. Each entry contains:
  - `content`: Content of the call log entry.
  - `role`: Role associated with the call log entry (e.g., `system`, `assistant`, `user`).
- `call_start_date`: Timestamp indicating when the call started.
- `caller_id_name`: Name associated with the caller ID.
- `caller_id_num`: Number associated with the caller ID.
- `content_disposition`: Disposition of the content.
- `content_type`: Type of content. The value will be `text/swaig`.
- `conversation_id`: A unique identifier for the conversation thread, if configured via the AI parameters.
- `post_prompt_data`: The answer from the AI agent to the `post_prompt`. The object contains the following three fields:
  - `parsed`: If a JSON object is detected within the answer, it is parsed and provided here.
  - `raw`: The raw answer from the AI agent.
  - `substituted`: The answer from the AI agent, excluding any JSON.
- `project_id`: ID of the Project.
- `space_id`: ID of the Space.
- `SWMLVars`: A collection of variables related to SWML.
- `swaig_log`: A log of SWAIG function calls.
- `total_input_tokens`: The total number of input tokens.
- `total_output_tokens`: The total number of output tokens.
- `version`: Version number.

### Post prompt callback request example

Below is a JSON example of the callback request that is sent to the `post_prompt_url`:

```json
{
  "total_output_tokens": 119,
  "caller_id_name": "[CALLER_NAME]",
  "SWMLVars": {
    "ai_result": "success",
    "answer_result": "success"
  },
  "call_start_date": 1694541295773508,
  "project_id": "[PROJECT_ID]",
  "call_log": [
    {
      "content": "[AI INITIAL PROMPT/INSTRUCTIONS]",
      "role": "system"
    },
    {
      "content": "[AI RESPONSE]",
      "role": "assistant"
    },
    {
      "content": "[USER RESPONSE]",
      "role": "user"
    }
  ],
  "ai_start_date": 1694541297950440,
  "call_answer_date": 1694541296799504,
  "version": "2.0",
  "content_disposition": "Conversation Log",
  "conversation_id": "[CONVERSATION_ID]",
  "space_id": "[SPACE_ID]",
  "app_name": "swml app",
  "swaig_log": [
    {
      "post_data": {
        "content_disposition": "SWAIG Function",
        "conversation_id": "[CONVERSATION_ID]",
        "space_id": "[SPACE_ID]",
        "meta_data_token": "[META_DATA_TOKEN]",
        "app_name": "swml app",
        "meta_data": {},
        "argument": {
          "raw": "{\n \"target\": \"[TRANSFER_TARGET]\"\n}",
          "substituted": "",
          "parsed": [
            {
              "target": "[TRANSFER_TARGET]"
            }
          ]
        },
        "call_id": "[CALL_ID]",
        "content_type": "text/swaig",
        "ai_session_id": "[AI_SESSION_ID]",
        "caller_id_num": "[CALLER_NUMBER]",
        "caller_id_name": "[CALLER_NAME]",
        "project_id": "[PROJECT_ID]",
        "purpose": "Use to transfer to a target",
        "argument_desc": {
          "type": "object",
          "properties": {
            "target": {
              "description": "the target to transfer to",
              "type": "string"
            }
          }
        },
        "function": "transfer",
        "version": "2.0"
      },
      "command_name": "transfer",
      "epoch_time": 1694541334,
      "command_arg": "{\n \"target\": \"[TRANSFER_TARGET]\"\n}",
      "url": "https://example.com/here",
      "post_response": {
        "action": [
          {
            "say": "This is a say message!"
          },
          {
            "SWML": {
              "sections": {
                "main": [
                  {
                    "connect": {
                      "to": "+1XXXXXXXXXX"
                    }
                  }
                ]
              },
              "version": "1.0.0"
            }
          },
          {
            "stop": true
          }
        ],
        "response": "transferred to [TRANSFER_TARGET], the call has ended"
      }
    }
  ],
  "total_input_tokens": 5627,
  "caller_id_num": "[CALLER_NUMBER]",
  "call_id": "[CALL_ID]",
  "call_end_date": 1694541335435503,
  "content_type": "text/swaig",
  "action": "post_conversation",
  "post_prompt_data": {
    "substituted": "[SUMMARY_MESSAGE_PLACEHOLDER]",
    "parsed": [],
    "raw": "[SUMMARY_MESSAGE_PLACEHOLDER]"
  },
  "ai_end_date": 1694541335425164,
  "ai_session_id": "[AI_SESSION_ID]"
}
```

### Responding to post prompt requests

The response to the callback request should be a JSON object with the following parameters:

```json
{
  "response": "ok"
}
```

## Examples

### Minimal AI agent

```yaml
version: 1.0.0
sections:
  main:
    - answer: {}
    - ai:
        prompt:
          text: "You are a customer service agent. Answer questions about account status and billing."
```

```json
{
  "version": "1.0.0",
  "sections": {
    "main": [
      { "answer": {} },
      {
        "ai": {
          "prompt": {
            "text": "You are a customer service agent. Answer questions about account status and billing."
          }
        }
      }
    ]
  }
}
```

### Hints

```yaml
ai:
  hints:
    - Tony
    - hint: swimmel
      pattern: swimmel
      replace: SWML
```

```json
{
  "ai": {
    "hints": [
      "Tony",
      {
        "hint": "swimmel",
        "pattern": "swimmel",
        "replace": "SWML"
      }
    ]
  }
}
```

### Pronounce

```yaml
version: 1.0.0
sections:
  main:
    - ai:
        prompt:
          text: |
            You are an expert in the GIF file format. Tell the user whatever they'd like to know in this
            field.
        pronounce:
          - replace: GIF
            with: jif
```

```json
{
  "version": "1.0.0",
  "sections": {
    "main": [
      {
        "ai": {
          "prompt": {
            "text": "You are an expert in the GIF file format. Tell the user whatever they'd like to know in this\nfield.\n"
          },
          "pronounce": [
            {
              "replace": "GIF",
              "with": "jif"
            }
          ]
        }
      }
    ]
  }
}
```
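### Global data

`global_data` pairs naturally with template strings in the prompt. A sketch of this pattern (the `customer_name` key and greeting text are illustrative, not fixed names):

```yaml
version: 1.0.0
sections:
  main:
    - ai:
        global_data:
          customer_name: "Ariana"
        prompt:
          text: "Greet ${global_data.customer_name} by name, then ask how you can help."
```

```json
{
  "version": "1.0.0",
  "sections": {
    "main": [
      {
        "ai": {
          "global_data": {
            "customer_name": "Ariana"
          },
          "prompt": {
            "text": "Greet ${global_data.customer_name} by name, then ask how you can help."
          }
        }
      }
    ]
  }
}
```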
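### Post prompt

A sketch combining `post_prompt` and `post_prompt_url` to request and report a post-conversation summary. The URL, credentials, and summary wording are placeholders; note the `username:password@url` form for authenticated callbacks:

```yaml
version: 1.0.0
sections:
  main:
    - ai:
        prompt:
          text: "You are a support agent. Help the caller with billing questions."
        post_prompt:
          text: "Summarize the conversation in one short paragraph."
          temperature: 0.2
        post_prompt_url: "https://user:pass@example.com/post-prompt"
```

```json
{
  "version": "1.0.0",
  "sections": {
    "main": [
      {
        "ai": {
          "prompt": {
            "text": "You are a support agent. Help the caller with billing questions."
          },
          "post_prompt": {
            "text": "Summarize the conversation in one short paragraph.",
            "temperature": 0.2
          },
          "post_prompt_url": "https://user:pass@example.com/post-prompt"
        }
      }
    ]
  }
}
```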
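On the receiving side, a `post_prompt_url` endpoint just needs to parse the payload described in the callback section above and reply with a JSON object. A minimal, framework-agnostic sketch in Python; `handle_post_prompt` and `extract_summary` are illustrative helper names, not part of SWML:

```python
import json

def extract_summary(payload: dict) -> str:
    """Pull the AI's answer out of post_prompt_data.

    Prefers the parsed JSON answer when one was detected in the reply,
    falling back to the raw text answer otherwise.
    """
    data = payload.get("post_prompt_data", {})
    parsed = data.get("parsed")
    if parsed:
        return json.dumps(parsed)
    return data.get("raw", "")

def handle_post_prompt(raw_body: str) -> dict:
    """Process one callback request body and build the expected reply."""
    payload = json.loads(raw_body)
    # "post_conversation" is the only documented action for this callback.
    assert payload.get("action") == "post_conversation"
    user_turns = [e for e in payload.get("call_log", []) if e.get("role") == "user"]
    print(f"call {payload.get('call_id')}: {len(user_turns)} user turn(s); "
          f"summary: {extract_summary(payload)}")
    # The documented response is a JSON object such as {"response": "ok"}.
    return {"response": "ok"}
```

Wire this into any HTTP framework by passing the request body to `handle_post_prompt` and serializing its return value as the JSON response.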