AgentBase

set_prompt_llm_params


Set LLM parameters specifically for the main prompt. Parameters are passed through to the SignalWire server and validated against the target model’s capabilities.

Parameters

Parameters are passed as keyword arguments. The SDK sends only the parameters you explicitly set — it does not inject defaults. The default values noted below are server-side defaults applied by the SignalWire platform when a parameter is omitted.
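A minimal sketch of this explicit-only behavior (the helper below is hypothetical, for illustration; it is not the SDK's internal implementation):

```python
# Hypothetical helper: collect only the keyword arguments the caller
# actually passed. Omitted parameters are simply absent from the payload,
# so the SignalWire server applies its own defaults for them.
def collect_llm_params(**params):
    return dict(params)

payload = collect_llm_params(temperature=0.7, top_p=0.9)
# 'model', the penalties, etc. were not set, so they are not sent.
```

Because nothing is injected client-side, inspecting the payload shows exactly what the server will receive.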

Common options:

model
str

AI model to use (e.g., "gpt-4o-mini", "gpt-4.1-mini", "nova-micro", "nova-lite").

temperature
float

Output randomness. Range: 0.0 to 2.0. Default: 0.3. Lower values produce more deterministic responses.

top_p
float

Nucleus sampling threshold. Range: 0.0 to 1.0. Default: 1.0. Alternative to temperature.

barge_confidence
float

ASR confidence threshold for barge-in. Higher values make it harder for callers to interrupt.

presence_penalty
float

Topic diversity. Range: -2.0 to 2.0. Default: 0.1. Positive values encourage new topics.

frequency_penalty
float

Repetition control. Range: -2.0 to 2.0. Default: 0.1. Positive values reduce repetition.
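Since out-of-range values are rejected server-side, it can be useful to check them before sending. A hedged sketch of such a client-side check (the `validate_llm_params` helper and its range table mirror the documentation above but are not part of the SDK):

```python
# Illustrative range table built from the documented parameter ranges.
RANGES = {
    "temperature": (0.0, 2.0),
    "top_p": (0.0, 1.0),
    "presence_penalty": (-2.0, 2.0),
    "frequency_penalty": (-2.0, 2.0),
}

def validate_llm_params(**params):
    """Raise ValueError if a known numeric parameter is out of range."""
    for name, value in params.items():
        if name not in RANGES:
            continue  # e.g. 'model' and other non-numeric options
        lo, hi = RANGES[name]
        if not lo <= value <= hi:
            raise ValueError(f"{name}={value} is outside [{lo}, {hi}]")
    return params
```

Unknown parameters are passed through untouched, matching the SDK's pass-through design.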

Returns

AgentBase — Returns self for method chaining.
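Returning `self` lets configuration calls be chained. A sketch of that pattern using a stand-in class (`ChainableAgent` is illustrative only, not the real `AgentBase`):

```python
class ChainableAgent:
    """Stand-in demonstrating the return-self chaining pattern."""

    def __init__(self):
        self.params = {}

    def set_prompt_llm_params(self, **params):
        self.params.update(params)
        return self  # returning self is what enables chaining

# Each call returns the same agent, so calls compose left to right.
agent = ChainableAgent().set_prompt_llm_params(temperature=0.5).set_prompt_llm_params(top_p=0.9)
```

Later calls overwrite earlier values for the same key, which is the usual semantics for chained setters.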

Example

from signalwire_agents import AgentBase

agent = AgentBase(name="support", route="/support")
agent.set_prompt_text("You are a helpful assistant.")
agent.set_prompt_llm_params(
    model="gpt-4.1-mini",
    temperature=0.7,
    top_p=0.9
)
agent.serve()