FULL mode custom LLM — agent not calling base_url endpoint
I've configured a FULL-mode session with a custom LLM via `llm_configuration_id`. My OpenAI-compatible endpoint works when called directly (tested both streaming and non-streaming). Sessions create successfully, the agent joins the LiveKit room, and I see `lk.agent.events` and `lk.transcription` events, yet my endpoint receives zero HTTP requests.
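For reference, this is the shape of the direct test I ran against the endpoint (the URL and auth key below are placeholders; the payload follows the standard OpenAI chat-completions schema, which is what I assume the platform sends):

```python
import json

# Placeholder; in reality this is my Supabase Edge Function URL.
BASE_URL = "https://example.com/functions/v1/my-llm"

def build_chat_payload(stream: bool) -> dict:
    """Standard OpenAI chat-completions request body, i.e. the format
    I assume the platform POSTs to <base_url>/chat/completions."""
    return {
        "model": "custom",  # model name is ignored by my function
        "messages": [{"role": "user", "content": "ping"}],
        "stream": stream,
    }

if __name__ == "__main__":
    # Direct smoke test (requires `requests`); both variants return 200
    # when I call the function myself:
    #
    #   import requests
    #   r = requests.post(f"{BASE_URL}/chat/completions",
    #                     headers={"Authorization": "Bearer <key>"},
    #                     json=build_chat_payload(stream=False))
    #   print(r.status_code, r.json())
    print(json.dumps(build_chat_payload(stream=True), indent=2))
```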
LLM Configuration ID: 7899c80f-fab1-40c8-87f7-e57e5e1bc115
Avatar ID: 0930fd59-c8ad-434d-ad53-b391a1768720
base_url: [My Supabase Edge Function URL -- Happy to share privately]
Session creation body:

```json
{
  "mode": "FULL",
  "avatar_id": "0930fd59-c8ad-434d-ad53-b391a1768720",
  "avatar_persona": { "language": "en" },
  "interactivity_type": "CONVERSATIONAL",
  "llm_configuration_id": "7899c80f-fab1-40c8-87f7-e57e5e1bc115",
  "is_sandbox": false
}
```
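In case it matters: when `stream: true` is set, my function replies with standard OpenAI SSE framing. A trimmed sketch of the chunk encoding I emit (field names per the OpenAI streaming schema; the id and model values are placeholders):

```python
import json

def sse_chunk(delta_text: str, model: str = "custom") -> str:
    """One OpenAI-style streaming chunk, framed as a Server-Sent Event."""
    body = {
        "id": "chatcmpl-placeholder",  # placeholder id
        "object": "chat.completion.chunk",
        "model": model,
        "choices": [{
            "index": 0,
            "delta": {"content": delta_text},
            "finish_reason": None,
        }],
    }
    return f"data: {json.dumps(body)}\n\n"

def sse_done() -> str:
    """Terminal sentinel that ends an OpenAI-compatible stream."""
    return "data: [DONE]\n\n"

# Example: the wire bytes for a two-chunk reply.
stream = sse_chunk("Hel") + sse_chunk("lo") + sse_done()
```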
Is custom LLM integration fully supported in FULL mode? What am I missing?