Merged
6 changes: 6 additions & 0 deletions fern/docs.yml
@@ -1263,6 +1263,12 @@ navigation:
- page: Understanding Input and Output Tokens for LLM Gateway
path: pages/faq/speech-understanding/understanding-input-and-output-tokens-in-ai-models.mdx
slug: /understanding-input-and-output-tokens-in-ai-models
- section: LLM Gateway
skip-slug: true
contents:
- page: How do I switch from using LeMUR to using LLM Gateway?
path: pages/faq/llm-gateway/how-do-i-switch-from-lemur-to-llm-gateway.mdx
slug: /how-do-i-switch-from-lemur-to-llm-gateway
- section: Playground
skip-slug: true
contents:
New file: pages/faq/llm-gateway/how-do-i-switch-from-lemur-to-llm-gateway.mdx
@@ -0,0 +1,9 @@
---
title: "How do I switch from using LeMUR to using LLM Gateway?"
---

To switch from LeMUR to LLM Gateway, you need to make three key changes: include the transcript text directly in your request instead of passing transcript IDs, update the model name format, and modify your response handling to use the OpenAI-compatible format.

With LeMUR, you passed `transcript_ids` and the system automatically fetched the transcript content. With LLM Gateway, you first retrieve the transcript text and include it in your message content when calling the `/v1/chat/completions` endpoint at `llm-gateway.assemblyai.com`.
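As a rough illustration of that request shape, here is a minimal Python sketch. The endpoint URL and the OpenAI-compatible `choices[0].message.content` response path come from the description above; the model name, prompt, transcript text, and the `authorization` header are placeholders you would replace with your own values.

```python
import json
import urllib.request

LLM_GATEWAY_URL = "https://llm-gateway.assemblyai.com/v1/chat/completions"


def build_chat_request(transcript_text: str, prompt: str, model: str) -> dict:
    """Build an OpenAI-style chat payload.

    Unlike LeMUR, there is no transcript_ids field: the transcript text
    you retrieved beforehand goes directly into the message content.
    """
    return {
        "model": model,  # placeholder; use the model name format LLM Gateway expects
        "messages": [
            {
                "role": "user",
                "content": f"{prompt}\n\nTranscript:\n{transcript_text}",
            }
        ],
    }


def ask_llm_gateway(api_key: str, payload: dict) -> str:
    """POST the payload and pull the reply out of the OpenAI-compatible response."""
    req = urllib.request.Request(
        LLM_GATEWAY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "authorization": api_key,  # assumed auth header; check your account docs
            "content-type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible format: the text lives at choices[0].message.content
    return body["choices"][0]["message"]["content"]


# Build (but don't send) a sample request with placeholder values.
payload = build_chat_request(
    transcript_text="Hello, thanks for calling support today...",
    prompt="Summarize this call in two sentences.",
    model="example-model",
)
```

The key difference from a LeMUR call is visible in `build_chat_request`: the transcript text is interpolated into the user message rather than referenced by ID, so you must fetch the transcript yourself before calling the gateway.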

For a detailed step-by-step walkthrough with code examples in Python and JavaScript, see the [Migration Guide: From LeMUR to LLM Gateway](/docs/llm-gateway/migration-from-lemur).