---
title: set_post_prompt_llm_params
slug: /reference/python/agents/agent-base/set-post-prompt-llm-params
description: Set LLM parameters specifically for the post-prompt.
max-toc-depth: 3
---

[set-prompt-llm-params]: /docs/server-sdks/reference/python/agents/agent-base/set-prompt-llm-params

[ref-agentbase]: /docs/server-sdks/reference/python/agents/agent-base

Set LLM parameters specifically for the post-prompt. Accepts the same parameters as
[`set_prompt_llm_params()`][set-prompt-llm-params] except `barge_confidence`, which does not apply to post-prompts.

## **Parameters**

Same keyword arguments as [`set_prompt_llm_params()`][set-prompt-llm-params] (excluding `barge_confidence`).

## **Returns**

[`AgentBase`][ref-agentbase] -- Returns self for method chaining.
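
Because the method returns the agent instance, post-prompt parameters can be set inline with the rest of the agent's configuration. A minimal sketch, assuming the `signalwire_agents` package and that `set_post_prompt()` also returns self, as the SDK's setters conventionally do:

```python
from signalwire_agents import AgentBase

# Each setter returns the AgentBase instance, so calls can be chained
agent = (
    AgentBase(name="survey", route="/survey")
    .set_post_prompt("Return a JSON summary of the caller's responses.")
    .set_post_prompt_llm_params(model="gpt-4o-mini", temperature=0.2)
)
```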

## **Example**

```python {6}
from signalwire_agents import AgentBase

agent = AgentBase(name="support", route="/support")
agent.set_prompt_text("You are a helpful customer support agent.")
agent.set_post_prompt("Summarize this call as JSON with intent and resolution.")
agent.set_post_prompt_llm_params(
    model="gpt-4o-mini",
    temperature=0.3
)
agent.serve()
```