set_prompt_llm_params
Set LLM parameters specifically for the main prompt. Parameters are passed through to the SignalWire server and validated against the target model’s capabilities.
Parameters
Parameters are passed as keyword arguments. The SDK sends only the parameters you explicitly set — it does not inject defaults. The default values noted below are server-side defaults applied by the SignalWire platform when a parameter is omitted.
Common options:
model
AI model to use (e.g., "gpt-4o-mini", "gpt-4.1-mini", "nova-micro", "nova-lite").
temperature
Output randomness. Range: 0.0 to 2.0. Default: 0.3. Lower values produce more deterministic responses.
top_p
Nucleus sampling threshold. Range: 0.0 to 1.0. Default: 1.0. Alternative to temperature; adjust one or the other, not both.
barge_confidence
Minimum ASR confidence required for caller speech to interrupt (barge in on) the agent while it is speaking. Higher values make it harder for callers to interrupt.
presence_penalty
Topic diversity. Range: -2.0 to 2.0. Default: 0.1. Positive values encourage new topics.
frequency_penalty
Repetition control. Range: -2.0 to 2.0. Default: 0.1. Positive values reduce repetition.
Returns
AgentBase: Returns self for method chaining.
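The keyword-argument pass-through and chaining behavior can be sketched with a minimal stand-in class (the class body below is illustrative only, not the real SDK implementation):

```python
class AgentBase:
    """Illustrative stand-in; not the real SDK class."""

    def __init__(self):
        self._prompt_llm_params = {}

    def set_prompt_llm_params(self, **params):
        # Only explicitly provided parameters are recorded; the SDK
        # injects no defaults -- omitted parameters fall back to
        # server-side defaults on the SignalWire platform.
        self._prompt_llm_params.update(params)
        return self  # returning self enables method chaining


agent = AgentBase()
agent.set_prompt_llm_params(model="gpt-4o-mini", temperature=0.3) \
     .set_prompt_llm_params(top_p=0.9)
print(agent._prompt_llm_params)
# {'model': 'gpt-4o-mini', 'temperature': 0.3, 'top_p': 0.9}
```

Because the method returns the agent itself, parameter calls can be chained with other configuration calls in a single fluent expression.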