2 changes: 1 addition & 1 deletion agent-schema.json
@@ -634,7 +634,7 @@
},
"provider_opts": {
"type": "object",
"description": "Provider-specific options. Sampling parameters: top_k (integer, supported by anthropic, google, amazon-bedrock, and custom OpenAI-compatible providers like vLLM/Ollama), repetition_penalty (float, forwarded to custom OpenAI-compatible providers), min_p (float, forwarded to custom providers), seed (integer, forwarded to OpenAI). Infrastructure options: dmr: runtime_flags. anthropic/amazon-bedrock (Claude): interleaved_thinking (boolean, default true), thinking_display ('summarized', 'omitted', or 'display') controls whether thinking blocks are returned in responses when thinking is enabled. Claude Opus 4.7 hides thinking by default ('omitted'); set thinking_display: summarized (or thinking_display: display) to receive thinking blocks. openai: transport ('sse' or 'websocket') to choose between SSE and WebSocket streaming for the Responses API. openai/anthropic/google: rerank_prompt (string) to fully override the system prompt used for RAG reranking (advanced - prefer using results.reranking.criteria for domain-specific guidance). Google: google_search (boolean) enables Google Search grounding, google_maps (boolean) enables Google Maps grounding, code_execution (boolean) enables server-side code execution.",
"description": "Provider-specific options. Sampling parameters: top_k (integer, supported by anthropic, google, amazon-bedrock, and custom OpenAI-compatible providers like vLLM/Ollama), repetition_penalty (float, forwarded to custom OpenAI-compatible providers), min_p (float, forwarded to custom providers), seed (integer, forwarded to OpenAI). Infrastructure options: http_headers (map of string to string, adds custom HTTP headers to every request; used for OpenAI-compatible providers like github-copilot which requires Copilot-Integration-Id). dmr: runtime_flags. anthropic/amazon-bedrock (Claude): interleaved_thinking (boolean, default true), thinking_display ('summarized', 'omitted', or 'display') controls whether thinking blocks are returned in responses when thinking is enabled. Claude Opus 4.7 hides thinking by default ('omitted'); set thinking_display: summarized (or thinking_display: display) to receive thinking blocks. openai: transport ('sse' or 'websocket') to choose between SSE and WebSocket streaming for the Responses API. openai/anthropic/google: rerank_prompt (string) to fully override the system prompt used for RAG reranking (advanced - prefer using results.reranking.criteria for domain-specific guidance). Google: google_search (boolean) enables Google Search grounding, google_maps (boolean) enables Google Maps grounding, code_execution (boolean) enables server-side code execution.",
"additionalProperties": true
},
"track_usage": {
2 changes: 2 additions & 0 deletions docs/_data/nav.yml
@@ -119,6 +119,8 @@
url: /providers/nebius/
- title: MiniMax
url: /providers/minimax/
- title: GitHub Copilot
url: /providers/github-copilot/
- title: Local Models
url: /providers/local/
- title: Provider Definitions
23 changes: 23 additions & 0 deletions docs/configuration/models/index.md
@@ -194,6 +194,29 @@ models:

See the [Anthropic provider page](/providers/anthropic/#thinking-display) for details.

## Custom HTTP Headers

For OpenAI-compatible providers (`openai`, `github-copilot`, `mistral`, `xai`,
`nebius`, `minimax`, `ollama`, and any custom provider using the OpenAI API),
`provider_opts.http_headers` adds arbitrary HTTP headers to every outgoing
request:

```yaml
models:
my_model:
provider: openai
model: gpt-4o
provider_opts:
http_headers:
X-Request-Source: docker-agent
X-Tenant-Id: my-team
```

Header names are matched case-insensitively. The `github-copilot` provider
automatically sets `Copilot-Integration-Id: vscode-chat` — see the
[GitHub Copilot provider page]({{ '/providers/github-copilot/' | relative_url }})
for details.

## Examples by Provider

```yaml
122 changes: 122 additions & 0 deletions docs/providers/github-copilot/index.md
@@ -0,0 +1,122 @@
---
title: "GitHub Copilot"
description: "Use GitHub Copilot's hosted models (GPT-4o, Claude, Gemini, and more) with docker-agent through your GitHub subscription."
permalink: /providers/github-copilot/
---

# GitHub Copilot

_Use GitHub Copilot's hosted models with docker-agent through your existing GitHub subscription._

## Overview

GitHub Copilot exposes an OpenAI-compatible Chat Completions API at
`https://api.githubcopilot.com`. docker-agent ships with built-in support for
it as the `github-copilot` provider, so anyone with a paid GitHub Copilot
subscription can reuse that entitlement from within docker-agent.

## Prerequisites

- An active **GitHub Copilot** subscription (Individual, Business, or Enterprise).
- A **personal access token** with the `copilot` scope, exported as `GITHUB_TOKEN`.

```bash
export GITHUB_TOKEN="ghp_..."
```

## Configuration

### Inline

```yaml
agents:
root:
model: github-copilot/gpt-4o
instruction: You are a helpful assistant.
```

### Named model

```yaml
models:
copilot:
provider: github-copilot
model: gpt-4o
temperature: 0.7
max_tokens: 4000

agents:
root:
model: copilot
```

## Available Models

The exact set of models you can call depends on your Copilot plan. The most
commonly available ones today are:

| Model | Best For |
| ------------------------ | ----------------------------------- |
| `gpt-4o` | Multimodal, balanced performance |
| `gpt-4o-mini` | Fast and cheap |
| `claude-sonnet-4` | Strong coding and analysis |
| `gemini-2.5-pro` | Google's flagship, large context |
| `o3-mini` | Reasoning-focused |

Check the
[GitHub Copilot documentation](https://docs.github.com/en/copilot)
for the current model list.

## `Copilot-Integration-Id` Header

GitHub's Copilot API rejects requests that lack a
`Copilot-Integration-Id` header, returning a `Bad Request` error.
docker-agent automatically sends a sensible default (`vscode-chat`) for
the `github-copilot` provider, so PAT-based usage works out of the box.

If you need to send a different integration ID (for example, if your
organization allow-lists a specific value), you can override it via
`provider_opts.http_headers`:

```yaml
models:
copilot:
provider: github-copilot
model: gpt-4o
provider_opts:
http_headers:
Copilot-Integration-Id: my-custom-integration
```

Header names are matched case-insensitively, so `copilot-integration-id`
works too.

## Custom HTTP Headers

`provider_opts.http_headers` is a generic escape hatch that works for any
OpenAI-compatible provider, not just GitHub Copilot. Every key/value pair
is added to every outgoing request:

```yaml
models:
my_model:
provider: openai
model: gpt-4o
provider_opts:
http_headers:
X-Request-Source: docker-agent
X-Tenant-Id: my-team
```

## How It Works

GitHub Copilot is implemented as a built-in alias in docker-agent:

- **API type:** OpenAI-compatible (Chat Completions)
- **Base URL:** `https://api.githubcopilot.com`
- **Token variable:** `GITHUB_TOKEN`
- **Default headers:** `Copilot-Integration-Id: vscode-chat`

This means docker-agent uses the same client as for OpenAI, so every
OpenAI feature it supports (tool calling, structured output, multimodal
inputs, etc.) is available when the underlying model supports it.
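
Conceptually, the alias is close to pointing the generic `openai` provider at Copilot's endpoint by hand. The sketch below is hypothetical (the `base_url` field name is assumed, and the token would then be read from the OpenAI key variable rather than `GITHUB_TOKEN`), so the built-in alias remains the practical choice:

```yaml
# Hypothetical manual equivalent of the github-copilot alias
# (field names assumed; prefer `provider: github-copilot`):
models:
  copilot_manual:
    provider: openai
    model: gpt-4o
    base_url: https://api.githubcopilot.com
    provider_opts:
      http_headers:
        Copilot-Integration-Id: vscode-chat
```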
13 changes: 7 additions & 6 deletions docs/providers/overview/index.md
@@ -57,12 +57,13 @@ _docker-agent supports multiple AI model providers. Choose the right one for you

docker-agent also includes built-in aliases for these providers:

| Provider | API Key Variable |
| ---------- | ----------------- |
| Mistral | `MISTRAL_API_KEY` |
| xAI (Grok) | `XAI_API_KEY` |
| Nebius | `NEBIUS_API_KEY` |
| MiniMax | `MINIMAX_API_KEY` |
| Provider | API Key Variable |
| --------------- | ----------------- |
| Mistral | `MISTRAL_API_KEY` |
| xAI (Grok) | `XAI_API_KEY` |
| Nebius | `NEBIUS_API_KEY` |
| MiniMax | `MINIMAX_API_KEY` |
| GitHub Copilot | `GITHUB_TOKEN` |

```bash
# Use built-in providers inline
18 changes: 18 additions & 0 deletions examples/github-copilot.yaml
@@ -1,9 +1,27 @@
#!/usr/bin/env docker agent run

# GitHub Copilot requires a `Copilot-Integration-Id` header on every request
# to https://api.githubcopilot.com. docker-agent sends a sensible default
# (`vscode-chat`) automatically, but you can override it (or add any other
# custom header) via `provider_opts.http_headers`.
#
# See https://github.com/docker/docker-agent/issues/2471

agents:
root:
model: github-copilot/gpt-4o
description: A helpful AI assistant powered by GitHub Copilot
instruction: |
You are a helpful AI assistant.
Be helpful, accurate, and concise in your responses.
models:
github-copilot/gpt-4o:
provider: github-copilot
model: gpt-4o
# Optional: override the default Copilot-Integration-Id header or add
# any other custom HTTP headers. Header names are matched
# case-insensitively against the default.
# provider_opts:
# http_headers:
# Copilot-Integration-Id: vscode-chat
14 changes: 14 additions & 0 deletions pkg/model/provider/openai/client.go
@@ -92,6 +92,10 @@ func NewClient(ctx context.Context, cfg *latest.ModelConfig, env environment.Pro
clientOptions = append(clientOptions, option.WithBaseURL(cfg.BaseURL))
}

// Apply custom HTTP headers from provider_opts (e.g. github-copilot's
// required Copilot-Integration-Id) and any provider-specific defaults.
clientOptions = append(clientOptions, buildHeaderOptions(cfg)...)

httpClient := httpclient.NewHTTPClient(ctx)
clientOptions = append(clientOptions, option.WithHTTPClient(httpClient))

@@ -514,6 +518,9 @@ func (c *Client) createWebSocketStream(

// buildWSHeaderFn returns a function that produces the HTTP headers needed
// for the WebSocket handshake, including the Authorization header and any
// custom headers from provider_opts.http_headers.
func (c *Client) buildWSHeaderFn() func(ctx context.Context) (http.Header, error) {
return func(ctx context.Context) (http.Header, error) {
h := http.Header{}
@@ -533,6 +540,13 @@ func (c *Client) buildWSHeaderFn() func(ctx context.Context) (http.Header, error
h.Set("Authorization", "Bearer "+apiKey)
}

// Apply custom headers from provider_opts (e.g. github-copilot's
// required Copilot-Integration-Id) and any provider-specific defaults.
// This ensures WebSocket connections have the same headers as HTTP.
for name, value := range buildHeaderMap(&c.ModelConfig) {
h.Set(name, value)
}

return h, nil
}
}
107 changes: 107 additions & 0 deletions pkg/model/provider/openai/headers.go
@@ -0,0 +1,107 @@
package openai

import (
"log/slog"
"net/http"
"strings"

"github.com/openai/openai-go/v3/option"

"github.com/docker/docker-agent/pkg/config/latest"
)

// GitHub Copilot's API requires a Copilot-Integration-Id header to
// identify the client integration when authenticating with a GitHub
// token. Without it, requests to https://api.githubcopilot.com are
// rejected with "Bad Request". We default to the identifier used by
// other integrations that have been validated against the Copilot API.
//
// See https://github.com/docker/docker-agent/issues/2471
const (
copilotIntegrationIDHeader = "Copilot-Integration-Id"
copilotIntegrationIDDefault = "vscode-chat"
)

// buildHeaderOptions returns OpenAI client options for every custom
// HTTP header configured for the model, including provider-specific
// defaults.
//
// Users can set headers via provider_opts.http_headers:
//
// models:
// copilot:
// provider: github-copilot
// model: gpt-4o
// provider_opts:
// http_headers:
// Copilot-Integration-Id: vscode-chat
//
// For the github-copilot provider a default Copilot-Integration-Id is
// injected when the user has not set one. Header names are compared
// case-insensitively, so any user-provided header always overrides the
// default.
func buildHeaderOptions(cfg *latest.ModelConfig) []option.RequestOption {
headers := buildHeaderMap(cfg)
opts := make([]option.RequestOption, 0, len(headers))
for name, value := range headers {
opts = append(opts, option.WithHeader(name, value))
}
return opts
}

// buildHeaderMap returns a map of HTTP headers to send with requests,
// including provider-specific defaults and user-configured headers from
// provider_opts.http_headers. Header names are canonicalized for
// case-insensitive deduplication.
func buildHeaderMap(cfg *latest.ModelConfig) map[string]string {
// Canonicalizing keys de-duplicates headers case-insensitively:
// defaults are applied first, then user config clobbers conflicts.
headers := map[string]string{}
if cfg != nil && cfg.Provider == "github-copilot" {
headers[copilotIntegrationIDHeader] = copilotIntegrationIDDefault
}
for name, value := range userHeaders(cfg) {
headers[http.CanonicalHeaderKey(name)] = sanitizeHeaderValue(value)
}
return headers
}

// userHeaders parses provider_opts.http_headers into a simple string
// map. Malformed entries are logged and skipped so a typo doesn't
// silently reach the wire.
func userHeaders(cfg *latest.ModelConfig) map[string]string {
if cfg == nil || cfg.ProviderOpts == nil {
return nil
}
raw, ok := cfg.ProviderOpts["http_headers"]
if !ok || raw == nil {
return nil
}
rawMap, ok := raw.(map[string]any)
if !ok {
slog.Warn("provider_opts.http_headers must be a map of string to string, ignoring", "value", raw)
return nil
}
headers := make(map[string]string, len(rawMap))
for k, v := range rawMap {
s, ok := v.(string)
if !ok {
slog.Warn("provider_opts.http_headers value must be a string, ignoring", "header", k, "value", v)
continue
}
headers[k] = s
}
return headers
}

// sanitizeHeaderValue removes CR and LF characters from header values to
// prevent header injection attacks. HTTP header values must not contain
// newlines (RFC 7230 section 3.2).
func sanitizeHeaderValue(value string) string {
// Remove all CR and LF characters
value = strings.ReplaceAll(value, "\r", "")
value = strings.ReplaceAll(value, "\n", "")
// Also strip leading/trailing whitespace for cleanliness
value = strings.TrimSpace(value)
return value
}