---

title: setPromptLlmParams
slug: /reference/typescript/agents/agent-base/set-prompt-llm-params
description: Set LLM parameters specifically for the main prompt.
max-toc-depth: 3
---

[ref-agentbase]: /docs/server-sdks/reference/typescript/agents/agent-base

Merges LLM-specific parameters (e.g., `model`, `temperature`) into the main
prompt configuration. The supplied values are merged into the `ai.prompt`
object of the SWML output.
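As a sketch of the merge behavior, calling `setPromptLlmParams({ temperature: 0.7 })` after setting the prompt text would yield an `ai.prompt` object along these lines. The exact SWML field layout shown here is an assumption based on the merge behavior described above:

```typescript
// Assumed shape of the resulting SWML fragment (illustrative only):
const swmlFragment = {
  ai: {
    prompt: {
      text: 'You are a helpful assistant.',
      temperature: 0.7, // merged in by setPromptLlmParams
    },
  },
};
```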

## **Parameters**

<ParamField path="params" type={"Record<string, unknown>"} required={true} toc={true}>
  Key-value LLM parameters to merge. Common options include `model`, `temperature`,
  `top_p`, `barge_confidence`, `presence_penalty`, and `frequency_penalty`.
</ParamField>

## **Returns**

[`AgentBase`][ref-agentbase] -- Returns `this` for method chaining.
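
Because the method returns the agent instance, configuration calls can be chained. A minimal sketch, assuming `setPromptText` also returns `this` as part of the same builder-style API:

```typescript
import { AgentBase } from '@signalwire/sdk';

// Builder-style chaining; assumes setPromptText also returns `this`.
const agent = new AgentBase({ name: 'sales', route: '/sales' })
  .setPromptText('You are a concise sales assistant.')
  .setPromptLlmParams({ model: 'gpt-4.1-mini', temperature: 0.3 });
```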

## **Example**

```typescript {5-9}
import { AgentBase } from '@signalwire/sdk';

const agent = new AgentBase({ name: 'support', route: '/support' });
agent.setPromptText('You are a helpful assistant.');
agent.setPromptLlmParams({
  model: 'gpt-4.1-mini',
  temperature: 0.7,
  top_p: 0.9,
});
await agent.serve();
```