309 changes: 309 additions & 0 deletions docs/guides/mcp.mdx
---
title: HyperFrames MCP
description: "Author, preview, and render HyperFrames videos directly inside Claude.ai and ChatGPT — no local install required."
---

The HyperFrames MCP is a hosted [Model Context Protocol](https://modelcontextprotocol.io) server that lets you create, edit, preview, and render HyperFrames video compositions from inside Claude.ai or ChatGPT.

<Note>
**In beta.** Features and pricing may change. Found a bug or have feedback? [File an issue on GitHub](https://github.com/heygen-com/hyperframes/issues).
</Note>

## What you can do

- Create compositions from natural-language prompts
- Edit existing compositions conversationally — *"make the title 2x bigger"*, *"add hype-style captions"*
- Preview the result inline in the chat with a video player widget
- Render to `mp4`, `webm`, or `mov` — output URL streamed back to chat
- Revisit compositions you've created previously
- Check your render credits

The agent behind `compose` has 25+ HyperFrames-specific skills baked in — typography, color palettes, motion principles, GSAP effects, audio-reactive animation, captions, voice generation. You don't have to specify any of this directly; describe the video you want and the agent picks the right tools.

<Note>
Looking for the open-source CLI? See the [Quickstart](/quickstart). The MCP is a hosted product for zero-install authoring inside an LLM chat. The CLI gives you full control of rendering and runtime; the MCP gives you instant authoring with cloud rendering.
</Note>

## Setup

### 1. Get a HeyGen account

The MCP requires a HeyGen account for authentication and credits. Sign up at [heygen.com](https://heygen.com) if you don't have one.

### 2. Add the connector

<Tabs>
<Tab title="Claude.ai">
<Steps>
<Step title="Open Settings → Connectors">
In Claude.ai web or desktop: **Settings → Connectors → Add custom connector**.
</Step>
<Step title="Enter the URL">
Paste:
```
https://mcp.heygen.com/mcp/hyperframes
```
</Step>
<Step title="Sign in to HeyGen">
OAuth opens in a new window. Authorize the HyperFrames connector to access your HeyGen account.
</Step>
<Step title="Start a new chat">
Open a new Claude.ai chat. Try:

> Make me a 10-second product intro for [your product] with bouncy captions and a high-energy soundtrack.
</Step>
</Steps>
</Tab>
<Tab title="ChatGPT">
<Steps>
<Step title="Open Apps & Connectors">
In ChatGPT: **Settings → Apps & Connectors → Add MCP server**.
</Step>
<Step title="Enter the URL">
Paste:
```
https://mcp.heygen.com/mcp/hyperframes
```
</Step>
<Step title="Sign in to HeyGen">
Authorize via OAuth.
</Step>
<Step title="Start a new chat">
Open a new chat. The same prompts work as in Claude.ai.
</Step>
</Steps>
</Tab>
</Tabs>

## Available tools

The MCP exposes six tools to the LLM. You don't call them directly — the model picks the right tool based on your message.

| Tool | What it does | Cost |
|---|---|---|
| `compose` | Create a new composition or edit an existing one | Author credits |
| `list_compositions` | List your previously created compositions | Free |
| `get_composition` | Open a specific composition with an inline player | Free |
| `render_video` | Submit a cloud render to mp4 / webm / mov | Render credits |
| `get_render_status` | Poll a long-running render job | Free |
| `get_credits` | Check your remaining credits and tier | Free |
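
Under the hood, each row in this table corresponds to an MCP `tools/call` JSON-RPC request that the host constructs on your behalf. As a rough sketch of the wire format — the method name comes from the MCP spec, but the `arguments` field names below are illustrative, not the HyperFrames server's actual schema:

```python
import json

# Illustrative JSON-RPC 2.0 message for an MCP tool call.
# "tools/call" is the method defined by the MCP spec; the
# argument names here are hypothetical examples only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "render_video",
        "arguments": {
            "composition_id": "comp_123",  # hypothetical id
            "format": "webm",
            "fps": 30,
        },
    },
}

print(json.dumps(request, indent=2))
```

In normal use you never write these messages yourself — the host serializes them from your chat message — but the shape is useful to know when reading MCP Inspector output.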

### compose

Authors a new composition or applies an edit to an existing one. The HyperFrames agent handles voice selection, captions, blocks, layout, transitions, color, and timing internally based on your natural-language prompt.

**Triggered by prompts like:**

- "Make a 30-second product intro about [topic]" → creates a fresh composition
- "Change the title font to a bold serif" → edits the most recent composition
- "Add a flash transition before the call-to-action" → applies a structured edit

**Returns:** A composition reference (id, title, thumbnail) plus an inline player widget. Progress notifications stream during the run so you can see what the agent is doing — *"Drafting outline...", "Selecting voice and style...", "Generating HTML...", "Rendering preview frame..."*

### list_compositions

Lists compositions you've previously created. Newest first, paginated.

**Triggered by prompts like:** *"show me my recent videos"*, *"what did I work on yesterday?"*

### get_composition

Fetches metadata for a single composition along with an inline player widget.

**Triggered by prompts like:** *"open that video again"*, *"show me the one I made about [topic]"*

### render_video

Submits a cloud render. Defaults to `mp4` at `30fps`.

**Format options:**

| Format | Codec | Use it for |
|---|---|---|
| `mp4` (default) | H.264 | Broadest compatibility, social media, web |
| `webm` | VP9 | Smaller files; supports alpha channel for transparent overlays |
| `mov` | ProRes | Visually lossless quality for editing pipelines |

**Frame rate options:** `24`, `30` (default), `60`.

**Returns:** Either the rendered video URL (if the render finishes within 25 seconds) or a `job_id` for polling. Either way, an inline render-progress widget shows live status.

**Triggered by prompts like:** *"render this"*, *"export to webm"*, *"render at 60fps for editing"*.
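
The URL-or-`job_id` return behavior maps naturally onto a poll loop, which is what the model does for you via `get_render_status`. A minimal sketch of that pattern — the function signature, status values, and field names here are assumptions for illustration, not the MCP's actual API:

```python
import time

def wait_for_render(get_status, job_id, timeout_s=300, poll_interval_s=5):
    """Poll a render job until it completes or times out.

    `get_status` stands in for the get_render_status tool: assumed
    to return a dict with a "status" key and, once finished, a
    "video_url" key. These field names are illustrative.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        result = get_status(job_id)
        if result["status"] == "done":
            return result["video_url"]
        if result["status"] == "failed":
            raise RuntimeError(f"render {job_id} failed")
        time.sleep(poll_interval_s)
    raise TimeoutError(f"render {job_id} still running after {timeout_s}s")
```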

### get_render_status

Polls an in-progress render. Used internally by the model when `render_video` returns a `job_id` (long renders).

### get_credits

Returns your tier and remaining credits.

**Triggered by:** *"how many renders do I have left?"*, *"what's my plan?"*

## Prompting tips

### Be specific about what you want

The agent has lots of creative latitude — give it enough direction to use it well.

| Less effective | More effective |
|---|---|
| "make a video" | "make a 15-second TikTok hook about home composting with bouncy captions and a warm earthy palette" |
| "add captions" | "add hype-style captions in my brand color #FF6A00" |
| "make it shorter" | "trim to 10 seconds total — cut the third scene" |
| "more energetic" | "swap to a neon-electric palette and tighten all transitions to 200ms" |

### Iterate conversationally

Once a composition exists, the agent loads the current state and applies edits in place. Keep talking to it.

```
You: "20-second product intro for my app, dark theme, hype style"
Agent: [composition + player widget appears]

You: "make the logo bigger and add a pulse animation on the CTA"
Agent: [updated player widget]

You: "render to webm with alpha"
Agent: [render-progress widget, then player widget with download link]
```

### Reference your existing HeyGen assets

If you've uploaded logos, brand voices, fonts, or other assets to your HeyGen account, the agent can use them. Just say *"use my logo"* or *"use my Sarah brand voice."* The agent resolves the asset by name and recency.

### Pick the right format up front

Mention the output format if you have a specific use case:

- *"render to mp4"* — default, social media
- *"render to webm with alpha"* — transparent overlay you'll composite later
- *"render to mov for After Effects"* — ProRes for editing

## Debugging

### Inspect the MCP with MCP Inspector

For developers building integrations or debugging tool responses, the MCP Inspector lets you see exactly what tools are exposed and what they return:

```bash
npx @modelcontextprotocol/inspector npx -y mcp-remote https://mcp.heygen.com/mcp/hyperframes
```

Complete the OAuth flow in the inspector's browser tab. Once authenticated, you can call each tool with custom parameters and see raw responses, including `tool_data` and `widget_data`.

### Watch progress notifications

The MCP emits MCP `notifications/progress` events during long-running `compose` and `render_video` calls. The host (Claude.ai or ChatGPT) displays them inline:

```
You: "make me a 30-second product intro"
Agent: [calls compose]
↳ "Drafting outline..." ← progress notification
↳ "Selecting voice and style..." ← progress notification
↳ "Generating HTML..." ← progress notification
↳ "Rendering preview frame..." ← progress notification
↳ [composition + player widget]
Agent: "Here's your video — [player]"
```

If progress stops mid-flow, the run failed. The agent's next message should explain what went wrong.
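
On the wire, each of those inline updates is an MCP progress notification. Per the MCP spec the method is `notifications/progress`; the specific token and message values below are illustrative:

```python
import json

# Shape of an MCP progress notification. Notifications carry no
# "id" field (no response is expected). "total" is optional; the
# message text mirrors the compose updates shown above.
notification = {
    "jsonrpc": "2.0",
    "method": "notifications/progress",
    "params": {
        "progressToken": "compose-1",  # echoes the token sent with the request
        "progress": 2,
        "total": 4,
        "message": "Selecting voice and style...",
    },
}

print(json.dumps(notification))
```

This is what you'll see in MCP Inspector's notifications pane while a `compose` or `render_video` call runs.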

### Common issues

<AccordionGroup>
<Accordion title="OAuth keeps failing or loops">
Verify you're using the production URL: `https://mcp.heygen.com/mcp/hyperframes`. The dev URL (`mcp.dev.heygen.com`) only accepts dev accounts.

If OAuth completes but you see "not authorized" errors, your HeyGen account may not have access to the MCP — contact support or check your tier.
</Accordion>

<Accordion title="Render is stuck or never completes">
Renders normally finish in 10-90 seconds depending on length, fps, and format.

If `get_render_status` shows `status: rendering` for more than 5 minutes, the job is likely stuck. Try:

1. Start a new chat thread
2. Run `compose("regenerate this composition")` — the underlying composition may reference an asset that's failing to load
3. If the issue persists, file an issue at [github.com/heygen-com/hyperframes/issues](https://github.com/heygen-com/hyperframes/issues) including the `job_id` from `render_video`
</Accordion>

<Accordion title='"Composition not found" error'>
Each composition is owned by the HeyGen space (account) that created it. If you're signed in to a different space, you can't access compositions created elsewhere. Run `list_compositions` to see what's available from your current account.
</Accordion>

<Accordion title="Player widget shows blank or loads forever">
Usually a transient connection issue between the widget and `mcp.heygen.com`. Refresh the chat or call `get_composition` again.

If it persists, your composition may reference a media asset that failed to upload — recreate the composition with `compose("regenerate this")`.
</Accordion>

<Accordion title="Out of credits">
The MCP returns an error with an upgrade URL when your credits are exhausted. Visit [heygen.com/pricing](https://heygen.com/pricing) to upgrade your tier.
</Accordion>

<Accordion title="Tool call timed out in the host">
`compose` runs can take 30+ seconds for complex prompts. If your client times out before the response comes back, the underlying run is likely still progressing — wait 30 seconds and run `list_compositions`. The composition is probably already created.
</Accordion>

<Accordion title="Agent ignored part of my prompt">
The agent prefers structural decisions (palette, layout, motion) over fine-grained pixel positioning. If a specific edit doesn't land, rephrase it with more explicit direction:

- Less effective: *"the title is a little off"*
- More effective: *"move the title 40px down and increase its weight to 800"*

For pixel-precise control, use the [open-source CLI](/quickstart) — the MCP is optimized for fast natural-language iteration.
</Accordion>
</AccordionGroup>

### Reporting issues

For bugs or feature requests, file an issue at [github.com/heygen-com/hyperframes/issues](https://github.com/heygen-com/hyperframes/issues). Include:

- The prompt you used
- The `composition_id` and `job_id` (if applicable) — these are visible in the tool response details
- A description of what you expected vs. what happened
- The host (Claude.ai web/desktop, ChatGPT, etc.)

## Limitations

- **Cloud-only rendering.** All renders run on HeyGen infrastructure. Use the [CLI](/quickstart) if you need local rendering.
- **Single-user.** Each composition is owned by one HeyGen account. No team sharing in v1.
- **No binary uploads from chat.** You can reference assets you've already uploaded to HeyGen via the web UI, but the MCP does not currently accept new file uploads through chat. Upload via [app.heygen.com](https://app.heygen.com) first, then reference the asset by name.
- **Aspect ratios:** `16:9`, `9:16`, `1:1`, `4:5`. Other ratios fall back to the closest match.
- **No fine-grained editing tools.** Edits go through the agent. For pixel-precise control, use the [CLI](/quickstart) and a coding agent like Claude Code or Cursor.
- **Widget rendering requires a host that supports MCP widgets.** Claude.ai web/desktop and ChatGPT (Apps SDK) support widgets today. Text-only MCP clients (Claude Code CLI, Cursor, Windsurf) will see a clickable preview URL instead — full text-mode support is on the roadmap.
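
For intuition on the aspect-ratio fallback, a nearest-match over the supported ratios is one plausible rule. This is a hypothetical sketch only — the hosted service's exact matching behavior isn't documented:

```python
SUPPORTED_RATIOS = {"16:9": 16 / 9, "9:16": 9 / 16, "1:1": 1.0, "4:5": 4 / 5}

def closest_supported_ratio(width, height):
    """Pick the supported aspect ratio nearest to width/height.

    Illustrative guess at the documented fallback; the actual
    server-side rule may differ.
    """
    target = width / height
    return min(SUPPORTED_RATIOS, key=lambda name: abs(SUPPORTED_RATIOS[name] - target))
```

For example, a 21:9 ultrawide request would land on `16:9` under this rule.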

## How this relates to the open-source framework

[HyperFrames itself is open source](https://github.com/heygen-com/hyperframes) — the HTML composition format, CLI, renderer, and player. You can use HyperFrames locally without the MCP.

The MCP is a HeyGen-hosted product that wraps:

- The HyperFrames composition agent (the LLM that authors compositions)
- HeyGen's cloud rendering pipeline
- HeyGen's voice / TTS / asset libraries
- OAuth, credits, and tier management

| You want… | Use… |
|---|---|
| Zero local install, fast natural-language authoring | The MCP |
| Pixel-precise control, custom rendering, self-hosting | The [CLI](/quickstart) |
| Both | Author with the MCP, then download and refine with the CLI (planned export feature) |

## Next steps

<CardGroup cols={2}>
<Card title="Quickstart" href="/quickstart">
Try HyperFrames locally with the open-source CLI.
</Card>
<Card title="Prompting guide" href="/guides/prompting">
Tips for getting the best results when working with AI agents.
</Card>
<Card title="Catalog" href="/catalog/blocks/data-chart">
Browse 50+ ready-to-use blocks the agent draws from.
</Card>
<Card title="Examples" href="/examples">
Reference compositions you can clone.
</Card>
</CardGroup>