setPromptLlmParams
Merges LLM-specific parameters (e.g., model, temperature) into the main prompt
configuration. The values are merged into the ai.prompt object of the generated
SWML output, with later calls overriding earlier values for the same key.
Parameters
params
Key-value LLM parameters to merge. Common options include model, temperature,
top_p, barge_confidence, presence_penalty, and frequency_penalty.
Returns
AgentBase — Returns this for method chaining.
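The merge-and-chain behavior can be sketched as follows. This is a minimal stand-in, not the real SDK: the AgentBase class, its internal storage, and the render_swml helper here are illustrative assumptions; only the method name and the ai.prompt merge semantics come from the description above.

```python
class AgentBase:
    """Illustrative stand-in for the SDK's AgentBase (not the real class)."""

    def __init__(self):
        # Accumulated LLM parameters destined for ai.prompt.
        self._llm_params = {}

    def setPromptLlmParams(self, params):
        # Merge key-value LLM parameters; later calls override earlier keys.
        self._llm_params.update(params)
        return self  # return self to support method chaining

    def render_swml(self):
        # Hypothetical renderer: parameters land inside the ai.prompt object.
        return {"ai": {"prompt": {**self._llm_params}}}


agent = AgentBase()
# Chained calls merge into the same ai.prompt object.
agent.setPromptLlmParams({"model": "gpt-4o", "temperature": 0.7}) \
     .setPromptLlmParams({"top_p": 0.9})
swml = agent.render_swml()
```

After both calls, swml["ai"]["prompt"] contains all three keys, illustrating that each call merges rather than replaces the parameter set.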