---

title: setPostPromptLlmParams
slug: /reference/typescript/agents/agent-base/set-post-prompt-llm-params
description: Set LLM parameters specifically for the post-prompt.
max-toc-depth: 3
---

[set-prompt-llm-params]: /docs/server-sdks/reference/typescript/agents/agent-base/set-prompt-llm-params

[ref-agentbase]: /docs/server-sdks/reference/typescript/agents/agent-base

Merge LLM-specific parameters into the post-prompt configuration. Accepts the same
parameters as [`setPromptLlmParams()`][set-prompt-llm-params].
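Successive calls merge rather than replace: a key set in a later call overwrites the earlier value, while unrelated keys are preserved. A minimal sketch of this shallow-merge behavior with plain objects (assumed semantics for illustration, not the SDK's internal implementation):

```typescript
// Hypothetical sketch of the shallow key-value merge applied on
// successive calls; this is NOT the SDK's internal code.
const firstCall = { model: 'gpt-4o-mini', temperature: 0.7 };
const secondCall = { temperature: 0.3 };

// Keys from the later call overwrite earlier values;
// untouched keys (model) are preserved.
const merged = { ...firstCall, ...secondCall };

console.log(merged); // { model: 'gpt-4o-mini', temperature: 0.3 }
```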

## **Parameters**

<ParamField path="params" type={"Record<string, unknown>"} required={true} toc={true}>
  Key-value LLM parameters to merge.
</ParamField>

## **Returns**

[`AgentBase`][ref-agentbase] -- Returns `this` for method chaining.

## **Example**

```typescript {6}
import { AgentBase } from '@signalwire/sdk';

const agent = new AgentBase({ name: 'support', route: '/support' });
agent.setPromptText('You are a helpful customer support agent.');
agent.setPostPrompt('Summarize this call as JSON with intent and resolution.');
agent.setPostPromptLlmParams({
  model: 'gpt-4o-mini',
  temperature: 0.3,
});
await agent.serve();
```