diff --git a/fern/docs.yml b/fern/docs.yml
index fbab6d7d..ab5e2e39 100644
--- a/fern/docs.yml
+++ b/fern/docs.yml
@@ -1263,6 +1263,12 @@ navigation:
           - page: Understanding Input and Output Tokens for LLM Gateway
             path: pages/faq/speech-understanding/understanding-input-and-output-tokens-in-ai-models.mdx
             slug: /understanding-input-and-output-tokens-in-ai-models
+      - section: LLM Gateway
+        skip-slug: true
+        contents:
+          - page: How do I switch from using LeMUR to using LLM Gateway?
+            path: pages/faq/llm-gateway/how-do-i-switch-from-lemur-to-llm-gateway.mdx
+            slug: /how-do-i-switch-from-lemur-to-llm-gateway
       - section: Playground
         skip-slug: true
         contents:
diff --git a/fern/pages/faq/llm-gateway/how-do-i-switch-from-lemur-to-llm-gateway.mdx b/fern/pages/faq/llm-gateway/how-do-i-switch-from-lemur-to-llm-gateway.mdx
new file mode 100644
index 00000000..134914f8
--- /dev/null
+++ b/fern/pages/faq/llm-gateway/how-do-i-switch-from-lemur-to-llm-gateway.mdx
@@ -0,0 +1,9 @@
+---
+title: "How do I switch from using LeMUR to using LLM Gateway?"
+---
+
+To switch from LeMUR to LLM Gateway, you need to make three key changes: include the transcript text directly in your request instead of passing transcript IDs, update the model name format, and modify your response handling to use the OpenAI-compatible format.
+
+With LeMUR, you passed `transcript_ids` and the system automatically fetched the transcript content. With LLM Gateway, you first retrieve the transcript text and include it in your message content when calling the `/v1/chat/completions` endpoint at `llm-gateway.assemblyai.com`.
+
+For a detailed step-by-step walkthrough with code examples in Python and JavaScript, see the [Migration Guide: From LeMUR to LLM Gateway](/docs/llm-gateway/migration-from-lemur).
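The flow described in the new FAQ page (fetch the transcript text yourself, embed it in the message content, then call the OpenAI-compatible endpoint) can be sketched in Python. This is a minimal sketch, not the migration guide's code: the transcript fetch uses AssemblyAI's `GET /v2/transcript/{id}` endpoint, the chat payload assumes the standard OpenAI-compatible `/v1/chat/completions` shape the page mentions, and `MODEL` is a hypothetical placeholder since the page doesn't state the new model name format.

```python
import json
import urllib.request

API_KEY = "YOUR_ASSEMBLYAI_API_KEY"   # placeholder
MODEL = "your-gateway-model-name"     # hypothetical: use a model name from the migration guide


def build_chat_request(transcript_text: str, prompt: str, model: str) -> dict:
    """Build an OpenAI-compatible chat payload that embeds the transcript
    text directly in the message content (no more transcript_ids)."""
    return {
        "model": model,
        "messages": [
            {"role": "user", "content": f"{prompt}\n\nTranscript:\n{transcript_text}"}
        ],
    }


def ask_llm_gateway(transcript_id: str, prompt: str) -> str:
    # Step 1: retrieve the transcript text yourself (LeMUR used to do this
    # automatically when you passed transcript_ids).
    req = urllib.request.Request(
        f"https://api.assemblyai.com/v2/transcript/{transcript_id}",
        headers={"authorization": API_KEY},
    )
    with urllib.request.urlopen(req) as resp:
        transcript_text = json.load(resp)["text"]

    # Step 2: call the OpenAI-compatible chat completions endpoint.
    payload = build_chat_request(transcript_text, prompt, MODEL)
    req = urllib.request.Request(
        "https://llm-gateway.assemblyai.com/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"authorization": API_KEY, "content-type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Step 3: read the OpenAI-compatible response shape.
        return json.load(resp)["choices"][0]["message"]["content"]
```

Compared to a LeMUR call, the only structural changes are the explicit transcript fetch in step 1 and the `choices[0].message.content` response path in step 3; see the linked migration guide for the exact supported model names.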