setPromptLlmParams


Merges LLM-specific parameters (e.g., model, temperature) into the main prompt configuration. The parameters are merged into the `ai.prompt` object in the generated SWML output.
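To illustrate the merge semantics, here is a minimal self-contained sketch (not the SDK's actual implementation): repeated calls shallow-merge keys into the `ai.prompt` section, with later calls overwriting earlier values for the same key.

```typescript
// Illustrative sketch only -- models how params accumulate in ai.prompt.
type PromptParams = Record<string, unknown>;

// Stand-in for the SWML document the agent builds.
const swml = { ai: { prompt: {} as PromptParams } };

function setPromptLlmParams(params: PromptParams): void {
  // Shallow merge: later calls overwrite earlier values for the same key.
  Object.assign(swml.ai.prompt, params);
}

setPromptLlmParams({ model: 'gpt-4.1-mini', temperature: 0.5 });
setPromptLlmParams({ temperature: 0.7 }); // overrides the earlier 0.5

console.log(swml.ai.prompt); // { model: 'gpt-4.1-mini', temperature: 0.7 }
```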

Parameters

params
`Record<string, unknown>` (required)

Key-value LLM parameters to merge into `ai.prompt`. Common options include `model`, `temperature`, `top_p`, `barge_confidence`, `presence_penalty`, and `frequency_penalty`.

Returns

`AgentBase` — Returns `this` for method chaining.

Example

```typescript
import { AgentBase } from '@signalwire/sdk';

const agent = new AgentBase({ name: 'support', route: '/support' });
agent.setPromptText('You are a helpful assistant.');
agent.setPromptLlmParams({
  model: 'gpt-4.1-mini',
  temperature: 0.7,
  top_p: 0.9,
});
await agent.serve();
```