LangChain Orchestrates Logic. It Does Not Handle Phone Calls.
Latency compounds at every hop
HTTP-based voice bridging adds 2 to 4 seconds of overhead on top of your LLM latency. Developers report 6 to 7 second round trips. At two seconds, callers get impatient. At six, they hang up.
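As a rough back-of-envelope sketch of how those hops compound (the per-hop figures below are illustrative assumptions drawn from the ranges above, not measurements):

```python
# Illustrative turn budget for an HTTP-bridged voice agent.
# Each per-hop figure is an assumption within the cited ranges.
bridge_overhead_ms = {
    "audio capture -> STT (middleware hop)": 800,
    "TTS -> audio playback (middleware hop)": 500,
    "transport, transcoding, buffering": 2700,
}
llm_ms = 2500  # your model's own response time

total_ms = llm_ms + sum(bridge_overhead_ms.values())
print(f"Round trip: {total_ms / 1000:.1f} s")  # Round trip: 6.5 s
```

Even with a fast model, the bridge overhead alone pushes the turn past the point where callers give up.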
Telephony is a separate engineering problem
SIP signaling, codec transcoding, jitter buffers, echo cancellation, and NAT traversal have nothing to do with your agent's reasoning graph. They require domain expertise your team should not need.
Protocol mismatch blocks deployment
Your agent speaks WebSocket. Phones speak SIP. Bridging them without middleware latency requires infrastructure purpose-built for real-time voice, not a general-purpose WebRTC adapter.
Scaling voice is not scaling HTTP
Concurrent voice sessions require media processing, carrier-grade SIP handling, and geographic distribution. Your agent scales horizontally. The voice layer needs telephony infrastructure underneath.
Build a Voice AI Agent
```python
from signalwire_agents import AgentBase
from signalwire_agents.core.function_result import SwaigFunctionResult


class SupportAgent(AgentBase):
    def __init__(self):
        super().__init__(name="Support Agent", route="/support")
        self.prompt_add_section(
            "Instructions",
            body="You are a customer support agent. "
                 "Greet the caller and resolve their issue.")
        self.add_language("English", "en-US", "rime.spore:mistv2")

    @AgentBase.tool(name="check_order")
    def check_order(self, order_id: str):
        """Check the status of a customer order.

        Args:
            order_id: The order ID to look up
        """
        return SwaigFunctionResult(f"Order {order_id}: shipped, ETA April 2nd")


agent = SupportAgent()
agent.run()
```
```javascript
import { AgentBase, FunctionResult } from '@signalwire/sdk';

const agent = new AgentBase({
  name: 'Support Agent',
  route: '/support',
});

agent.promptAddSection(
  'Instructions',
  'You are a customer support agent. Greet the caller and resolve their issue.'
);

agent.addLanguage({ name: 'English', code: 'en-US', voice: 'rime.spore:mistv2' });

agent.defineTool({
  name: 'check_order',
  description: 'Check the status of a customer order',
  parameters: {
    type: 'object',
    properties: {
      order_id: { type: 'string', description: 'The order ID to look up' },
    },
    required: ['order_id'],
  },
  handler: (args) => {
    return new FunctionResult(`Order ${args.order_id}: shipped, ETA April 2nd`);
  },
});

agent.run();
```
```go
package main

import (
	"fmt"

	"github.com/signalwire/signalwire-go/pkg/agent"
	"github.com/signalwire/signalwire-go/pkg/swaig"
)

func main() {
	a := agent.NewAgentBase(
		agent.WithName("Support Agent"),
		agent.WithRoute("/support"),
	)

	a.PromptAddSection("Instructions",
		"You are a customer support agent. Greet the caller and resolve their issue.")

	a.AddLanguage(map[string]any{
		"name":  "English",
		"code":  "en-US",
		"voice": "rime.spore:mistv2",
	})

	a.DefineTool(agent.ToolDefinition{
		Name:        "check_order",
		Description: "Check the status of a customer order",
		Parameters: map[string]any{
			"type": "object",
			"properties": map[string]any{
				"order_id": map[string]any{
					"type":        "string",
					"description": "The order ID to look up",
				},
			},
			"required": []string{"order_id"},
		},
		Handler: func(args map[string]any, rawData map[string]any) *swaig.FunctionResult {
			orderID := args["order_id"]
			return swaig.NewFunctionResult(
				fmt.Sprintf("Order %v: shipped, ETA April 2nd", orderID),
			)
		},
	})

	a.Run()
}
```
```java
import com.signalwire.sdk.agent.AgentBase;
import com.signalwire.sdk.swaig.FunctionResult;

import java.util.List;
import java.util.Map;

public class SupportAgent {
    public static void main(String[] args) throws Exception {
        var agent = AgentBase.builder()
                .name("Support Agent")
                .route("/support")
                .build();

        agent.promptAddSection("Instructions",
                "You are a customer support agent. "
                + "Greet the caller and resolve their issue.");

        agent.addLanguage("English", "en-US", "rime.spore:mistv2");

        agent.defineTool(
                "check_order",
                "Check the status of a customer order",
                Map.of(
                        "type", "object",
                        "properties", Map.of(
                                "order_id", Map.of(
                                        "type", "string",
                                        "description", "The order ID to look up")),
                        "required", List.of("order_id")),
                (toolArgs, rawData) -> {
                    var orderId = toolArgs.get("order_id");
                    return new FunctionResult("Order " + orderId + ": shipped, ETA April 2nd");
                });

        agent.run();
    }
}
```
```ruby
# frozen_string_literal: true

require 'signalwire'

agent = SignalWire::AgentBase.new(name: 'Support Agent', route: '/support')

agent.prompt_add_section(
  'Instructions',
  'You are a customer support agent. Greet the caller and resolve their issue.'
)

agent.add_language(name: 'English', code: 'en-US', voice: 'rime.spore:mistv2')

agent.define_tool(
  name: 'check_order',
  description: 'Check the status of a customer order',
  parameters: {
    'order_id' => { 'type' => 'string', 'description' => 'The order ID to look up' }
  }
) do |args, _raw|
  SignalWire::Swaig::FunctionResult.new("Order #{args['order_id']}: shipped, ETA April 2nd")
end

agent.run
```
```php
<?php

require 'vendor/autoload.php';

use SignalWire\Agent\AgentBase;
use SignalWire\SWAIG\FunctionResult;

$agent = new AgentBase(['name' => 'Support Agent', 'route' => '/support']);

$agent->promptAddSection(
    'Instructions',
    'You are a customer support agent. Greet the caller and resolve their issue.'
);

$agent->addLanguage('English', 'en-US', 'rime.spore:mistv2');

$agent->defineTool(
    name: 'check_order',
    description: 'Check the status of a customer order',
    parameters: [
        'order_id' => ['type' => 'string', 'description' => 'The order ID to look up'],
    ],
    handler: function (array $args): FunctionResult {
        return new FunctionResult("Order {$args['order_id']}: shipped, ETA April 2nd");
    }
);

$agent->run();
```
```perl
#!/usr/bin/env perl
use strict;
use warnings;
use lib 'lib';

use SignalWire::Agent::AgentBase;
use SignalWire::SWAIG::FunctionResult;

my $agent = SignalWire::Agent::AgentBase->new(
    name  => 'Support Agent',
    route => '/support',
);

$agent->prompt_add_section('Instructions',
    'You are a customer support agent. Greet the caller and resolve their issue.');

$agent->add_language(name => 'English', code => 'en-US', voice => 'rime.spore:mistv2');

$agent->define_tool(
    name        => 'check_order',
    description => 'Check the status of a customer order',
    parameters  => {
        order_id => { type => 'string', description => 'The order ID to look up' },
    },
    handler => sub {
        my ($args, $raw) = @_;
        return SignalWire::SWAIG::FunctionResult->new(
            response => "Order $args->{order_id}: shipped, ETA April 2nd");
    },
);

$agent->run;
```
```cpp
#include <signalwire/agent/agent_base.hpp>

using namespace signalwire;
using json = nlohmann::json;

class SupportAgent : public agent::AgentBase {
public:
    SupportAgent() : AgentBase("Support Agent", "/support") {
        prompt_add_section("Instructions",
            "You are a customer support agent. "
            "Greet the caller and resolve their issue.");

        add_language({"English", "en-US", "rime.spore:mistv2"});

        define_tool({
            .name = "check_order",
            .description = "Check the status of a customer order",
            .parameters = {{"order_id", {{"type", "string"},
                                         {"description", "The order ID to look up"}}}},
            .handler = [](const json& args, const json&) {
                auto order_id = args.value("order_id", "unknown");
                return swaig::FunctionResult("Order " + order_id + ": shipped, ETA April 2nd");
            }
        });
    }
};

int main() {
    SupportAgent().run();
}
```
```csharp
using SignalWire.Agent;
using SignalWire.SWAIG;

var agent = new AgentBase(new AgentOptions
{
    Name = "Support Agent",
    Route = "/support"
});

agent.PromptAddSection("Instructions",
    "You are a customer support agent. Greet the caller and resolve their issue.");

agent.AddLanguage("English", "en-US", "rime.spore:mistv2");

agent.DefineTool(
    "check_order",
    "Check the status of a customer order",
    new
    {
        type = "object",
        properties = new
        {
            order_id = new { type = "string", description = "The order ID to look up" }
        },
        required = new[] { "order_id" }
    },
    (args, rawData) =>
    {
        var orderId = args.TryGetValue("order_id", out var id) ? id : "unknown";
        return new FunctionResult($"Order {orderId}: shipped, ETA April 2nd");
    });

agent.Run();
```
```rust
use signalwire::agent::AgentBase;
use signalwire::swaig::FunctionResult;
use serde_json::json;

fn main() {
    let mut agent = AgentBase::builder()
        .name("Support Agent")
        .route("/support")
        .build();

    agent.prompt_add_section(
        "Instructions",
        "You are a customer support agent. Greet the caller and resolve their issue.",
        &[],
    )
    .add_language("English", "en-US", "rime.spore:mistv2");

    agent.define_tool(
        "check_order",
        "Check the status of a customer order",
        json!({
            "type": "object",
            "properties": {
                "order_id": {
                    "type": "string",
                    "description": "The order ID to look up"
                }
            },
            "required": ["order_id"]
        }),
        Box::new(|args, _raw| {
            let order_id = args.get("order_id").and_then(|v| v.as_str()).unwrap_or("unknown");
            FunctionResult::with_response(&format!("Order {order_id}: shipped, ETA April 2nd"))
        }),
    );

    agent.run();
}
```
What Stays. What Changes.
You Keep (unchanged)
Your LangGraph state graph and reasoning chains
Your tool definitions and implementations
Your RAG retrievers and vector stores
Your prompt templates and conversation memory
Your LLM choice (OpenAI, Anthropic, self-hosted)
You Add (via SignalWire)
A YAML document defining the telephony wrapper
Tool-calling endpoints on your existing server
A phone number pointed at your YAML document
Latency: DIY Voice Bridge vs. SignalWire
| Component | HTTP Voice Bridge | SignalWire |
| --- | --- | --- |
| Audio capture to STT | 200 to 800ms (middleware hop) | Co-located, sub-100ms |
| LLM processing | Your LLM latency (same) | Your LLM latency (same) |
| TTS to audio playback | 200 to 500ms (middleware hop) | Co-located, sub-100ms |
| Total overhead (excl. LLM) | 2 to 4 seconds | Under 1200ms median |
| Caller experience at 6 to 7s | Hang up | Conversational |
From Chat Agent to Phone Agent in an Afternoon
1. Wrap your tool handlers
Expose your existing LangChain tool-calling handlers as webhook endpoints. The request/response format matches standard function calling.
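A minimal sketch of such a handler as a plain function. The request and response dict shapes here are illustrative assumptions (check the SignalWire SWAIG reference for the exact wire format), and `check_order` stands in for whatever tool logic you already have:

```python
# Hypothetical webhook dispatch for an existing LangChain tool.
# Payload/response shapes are assumptions, not the exact wire format.

def check_order(order_id: str) -> str:
    # Your existing tool logic (RAG, database, API calls) goes here.
    return f"Order {order_id}: shipped, ETA April 2nd"

def handle_tool_call(payload: dict) -> dict:
    """Dispatch an incoming function-call payload to the right tool."""
    tools = {"check_order": lambda a: check_order(a.get("order_id", "unknown"))}
    name = payload.get("function")
    handler = tools.get(name)
    if handler is None:
        return {"response": f"Unknown function: {name}"}
    return {"response": handler(payload.get("arguments", {}))}

# Simulate the POST body your server would receive:
result = handle_tool_call({"function": "check_order",
                           "arguments": {"order_id": "A1001"}})
print(result["response"])  # Order A1001: shipped, ETA April 2nd
```

In production this function sits behind a route on your existing web server; the dispatch logic is the same.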
2. Write a YAML document
Define the call flow, available functions, and prompt. Your agent's sophistication lives in your handlers, not in YAML.
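Such a document might look roughly like this. The section and field names below are an illustrative sketch, not the exact schema; consult the SWML reference for the real shape, and the webhook URL is a placeholder:

```yaml
# Hypothetical sketch of the telephony wrapper document.
version: 1.0.0
sections:
  main:
    - ai:
        prompt:
          text: |
            You are a customer support agent.
            Greet the caller and resolve their issue.
        SWAIG:
          functions:
            - function: check_order
              description: Check the status of a customer order
              parameters:
                type: object
                properties:
                  order_id:
                    type: string
                    description: The order ID to look up
              web_hook_url: https://your-server.example.com/tools/check_order
```

Note that the YAML only declares which functions exist and where to reach them; the behavior behind `check_order` stays in your code.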
3. Get a phone number
Provision a number in your SignalWire dashboard and point it at your YAML document.
4. Call it
Your LangGraph agent answers a real phone call. The reasoning graph, retrievers, and tools all work. The hard part was building the agent, and you already did that.
SignalWire was founded by the engineers who wrote FreeSWITCH, the open-source telecom engine used by carriers and contact centers for 20 years. No middleware hops. No WebRTC bridges adding latency.
FAQ
Does this replace LangChain?
No. Your LangChain agent, graph, tools, retrievers, and LLM choice stay exactly where they are. SignalWire provides the telephony layer underneath. Two systems, each doing what it was built for.
What about the latency from my LLM?
SignalWire eliminates telephony overhead (codec transcoding, SIP signaling, media transport). Your LLM latency is your LLM latency. The total response time is your LLM time plus 800 to 1200ms of infrastructure time. With speech-to-speech voice models, infrastructure time can be as low as 600ms.
Can my agent still use its existing tools over voice?
Yes. Tool calls work identically. When the LLM decides to invoke a tool, the request goes to your handler. Your handler runs whatever logic it runs today (RAG, database, API calls) and returns the result.
What does it cost?
Voice AI processing starts at $0.16 per minute. Phone numbers and SIP trunking are billed at carrier rates. No per-tool surcharges.
The Phone Layer Is Infrastructure, Not a Research Project.
Your LangChain agent works over text. Making it work over the phone takes an afternoon, not a quarter.