---

title: set_prompt_llm_params
slug: /reference/python/agents/agent-base/set-prompt-llm-params
description: Set LLM parameters specifically for the main prompt.
max-toc-depth: 3
---

For a complete index of all SignalWire documentation pages, fetch https://signalwire.com/docs/llms.txt

[ref-agentbase]: /docs/server-sdks/reference/python/agents/agent-base

Set LLM parameters specifically for the main prompt. Parameters are passed through
to the SignalWire server and validated against the target model's capabilities.

## **Parameters**

Parameters are passed as keyword arguments. The SDK sends only the parameters you
explicitly set -- it does **not** inject defaults. The default values noted below are
**server-side defaults** applied by the SignalWire platform when a parameter is omitted.

Common options:

<ParamField path="model" type="str" toc={true}>
  AI model to use (e.g., `"gpt-4o-mini"`, `"gpt-4.1-mini"`, `"nova-micro"`, `"nova-lite"`).
</ParamField>

<ParamField path="temperature" type="float" toc={true}>
  Output randomness. Range: `0.0` -- `2.0`. Default: `0.3`. Lower values produce more
  deterministic responses.
</ParamField>

<ParamField path="top_p" type="float" toc={true}>
  Nucleus sampling threshold. Range: `0.0` -- `1.0`. Default: `1.0`. Alternative to temperature.
</ParamField>

<ParamField path="barge_confidence" type="float" toc={true}>
  ASR confidence threshold for barge-in. Higher values make it harder for callers to interrupt.
</ParamField>

<ParamField path="presence_penalty" type="float" toc={true}>
  Topic diversity. Range: `-2.0` -- `2.0`. Default: `0.1`. Positive values encourage new topics.
</ParamField>

<ParamField path="frequency_penalty" type="float" toc={true}>
  Repetition control. Range: `-2.0` -- `2.0`. Default: `0.1`. Positive values reduce repetition.
</ParamField>
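
Since the SDK forwards only the parameters you explicitly set, the pass-through behavior can be sketched as plain Python (illustrative only; `collect_llm_params` is a hypothetical helper, not part of the SDK):

```python
# Illustrative sketch of the "send only what was set" behavior described
# above. The function name is hypothetical, not an SDK internal.

def collect_llm_params(**kwargs):
    """Return only the parameters the caller explicitly set.

    No client-side defaults are injected; omitted keys are left for the
    SignalWire server to fill with its own defaults.
    """
    return dict(kwargs)

params = collect_llm_params(model="gpt-4o-mini", temperature=0.7)
# Only the two explicitly set keys are present; top_p and the penalties
# are omitted, so the server-side defaults apply.
assert params == {"model": "gpt-4o-mini", "temperature": 0.7}
```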

## **Returns**

[`AgentBase`][ref-agentbase] -- Returns self for method chaining.
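
Because the method returns `self`, configuration calls can be chained. A minimal stub illustrating the return-self convention (this is not the real `AgentBase`, just a sketch of the pattern):

```python
class ChainableAgent:
    """Stub illustrating the return-self chaining convention.

    Hypothetical class for demonstration; not the SDK's AgentBase.
    """

    def __init__(self):
        self.params = {}

    def set_prompt_llm_params(self, **kwargs):
        self.params.update(kwargs)
        return self  # returning self is what enables method chaining

agent = ChainableAgent()
result = agent.set_prompt_llm_params(temperature=0.3).set_prompt_llm_params(top_p=0.9)
assert result is agent
assert agent.params == {"temperature": 0.3, "top_p": 0.9}
```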

## **Example**

```python {5}
from signalwire_agents import AgentBase

agent = AgentBase(name="support", route="/support")
agent.set_prompt_text("You are a helpful assistant.")
agent.set_prompt_llm_params(
    model="gpt-4.1-mini",
    temperature=0.7,
    top_p=0.9
)
agent.serve()
```