fix(ai): allow Ollama Cloud API key in AI settings #4262
VincentEmmanuel wants to merge 17 commits into Dokploy:main from
Conversation
Update the GitHub Actions workflow to bump the version in package.json after installing dependencies, ensuring that the version is not overwritten by pnpm install. This change enhances the reliability of version synchronization for both MCP and CLI repositories.
Enhance the GitHub Actions workflow by adding a workflow_dispatch trigger, allowing manual execution of the version synchronization process. This provides greater flexibility in managing version updates for MCP and CLI repositories.
…LI repositories
Modify the GitHub Actions workflow to clone the MCP and CLI repositories into temporary directories instead of the current directory. This change improves the organization of the workflow and ensures that the latest OpenAPI specification is correctly referenced during the synchronization process.
Update the version retrieval command in the GitHub Actions workflow to strip the 'v' prefix from the version number in package.json. This change ensures that the version format is consistent for downstream processes.
style: Fix typo in custom entrypoint description
Adds a new /dashboard/home landing with welcome header, KPI cards (deploys/24h, build, CPU, memory) and a recent deployments list. Home is now the post-login landing and the destination for permission fallback redirects across the app. Projects remains accessible from the sidebar.
- Home: 4 KPI cards (projects, services, deploys/7d, status list), server column with icon in recent deployments, empty state with icon, dashboard card frame to match other pages.
- Include libsql in project services count sort.
- Fix bulk actions in environment page: libsql was missing from start, stop, move, delete and deploy handlers.
- Replace individual project and server queries with a consolidated homeStats query to streamline data retrieval for the dashboard.
- Update the ShowHome component to utilize homeStats for displaying project, environment, application, and service counts, along with their status breakdown.
- Enhance data handling for user permissions to ensure accurate statistics based on user access levels.
- Adjusted the Card component to have a minimum height of 85vh for better visual consistency.
- Ensured the inner div has a full height to enhance the layout structure.
feat: add dashboard home page
…postgres 100-arg limit
Closes Dokploy#4256
…yment-too-many-args
fix: preview deployments broken on v0.29.0 — postgres 100-arg limit
```diff
  },
  {
-   enabled: !!apiUrl && (isOllama || !!apiKey),
+   enabled: !!apiUrl && (isLocalOllama || !!apiKey),
```
Model listing broken for unauthenticated non-localhost Ollama
The enabled gate isLocalOllama || !!apiKey is false for any Ollama instance that isn't localhost/127.0.0.1 and has no API key — e.g. a LAN instance at http://192.168.1.100:11434. Those users will see the API Key field (now shown), leave it blank (no auth needed), and the models query will never fire. The model dropdown stays empty and they must type the model name manually without knowing why.
A simple fix is to also skip the gate for the :11434 port pattern, mirroring the original condition's intent:
```diff
- enabled: !!apiUrl && (isLocalOllama || !!apiKey),
+ enabled: !!apiUrl && (isLocalOllama || apiUrl.includes(":11434") || !!apiKey),
```
Or, alternatively, detect the pattern apiUrl.includes(":11434") && !apiKey as "unauthenticated Ollama" for gating purposes, separate from the API-key visibility decision.
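To make the suggestion concrete, here is a minimal sketch of the proposed gate as pure functions. The names mirror the component's locals, but `modelsQueryEnabled` is a hypothetical helper for illustration, not the actual `handle-ai.tsx` code:

```typescript
// Sketch of the reviewer's proposed gate. isLocalOllama matches the PR's
// localhost-only detection; the extra :11434 check restores the original
// "unauthenticated Ollama" intent for LAN/remote default-port instances.
function isLocalOllama(apiUrl: string): boolean {
  return (
    apiUrl.includes("localhost:11434") ||
    apiUrl.includes("127.0.0.1:11434")
  );
}

// Hypothetical helper: should the getModels query fire?
function modelsQueryEnabled(apiUrl: string, apiKey: string): boolean {
  return (
    !!apiUrl &&
    (isLocalOllama(apiUrl) || apiUrl.includes(":11434") || !!apiKey)
  );
}
```

With this gate, a LAN instance like `http://192.168.1.100:11434` fetches models without a key, while Ollama Cloud still requires one.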
```ts
const isLocalOllama =
  apiUrl.includes("localhost:11434") ||
  apiUrl.includes("127.0.0.1:11434");
```
isLocalOllama doesn't match http://[::1]:11434, so a user running Ollama on the IPv6 loopback will see the API Key field (and the model listing gate will block fetching unless they type a key). This is an edge case but easy to handle:
```diff
  const isLocalOllama =
    apiUrl.includes("localhost:11434") ||
-   apiUrl.includes("127.0.0.1:11434");
+   apiUrl.includes("127.0.0.1:11434") ||
+   apiUrl.includes("[::1]:11434");
```
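A further (hypothetical) hardening would be to parse the URL instead of substring-matching, which covers the IPv6 form, trailing paths, and non-standard casing in one place. `isLoopbackOllama` below is an illustration of that idea, not code from the PR:

```typescript
// Hypothetical alternative: parse the hostname rather than matching
// substrings, so http://[::1]:11434 and URLs with paths are handled uniformly.
function isLoopbackOllama(apiUrl: string): boolean {
  try {
    const { hostname, port } = new URL(apiUrl);
    // WHATWG URL keeps the brackets on IPv6 hostnames, e.g. "[::1]".
    const loopback = ["localhost", "127.0.0.1", "[::1]"].includes(hostname);
    return loopback && port === "11434";
  } catch {
    return false; // not a parseable URL yet (e.g. the user is still typing)
  }
}
```

The try/catch keeps the check safe to run on every keystroke of the URL field.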
The Ollama detection matched any URL containing "ollama", which hides the API Key field for Ollama Cloud (ollama.com) and drops the key from the createOllama() client, so cloud requests go out unauthenticated. Narrow the rule to localhost-only Ollama and forward the API key as a Bearer header when provided. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
9f3736c to beb51e4
Summary
The Ollama provider detection in AI Settings matches any URL containing the substring `ollama`, which hides the API Key input for Ollama Cloud (ollama.com) and drops the key from the `createOllama()` client. As a result, Ollama Cloud cannot be used: requests go out unauthenticated and fail.

This PR:

- Narrows the Ollama detection to local instances (`localhost:11434`/`127.0.0.1:11434`), so the API Key field is visible for Ollama Cloud and any self-hosted Ollama behind an auth proxy.
- Forwards the key as an `Authorization: Bearer ...` header to `createOllama`, using the `headers` option supported by `ai-sdk-ollama` v3.7.

The `getModels` ollama branch in the AI router already builds its headers via `getProviderHeaders(apiUrl, apiKey)`, so once the key reaches the backend it authenticates `/api/tags` correctly; no router change needed.

Repro (before fix)
ollama)

After fix
`Authorization: Bearer <key>` on both model listing and completion requests

Files changed
- `apps/dokploy/components/dashboard/settings/handle-ai.tsx` — replace `isOllama` substring check with `isLocalOllama` (localhost-only)
- `packages/server/src/utils/ai/select-ai-provider.ts` — pass `headers` with Bearer token to `createOllama`

Test plan
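For illustration, the header construction in `select-ai-provider.ts` can be sketched as a small pure helper. `ollamaHeaders` is a hypothetical name; per the PR description the actual change passes such an object to `createOllama`'s `headers` option:

```typescript
// Hypothetical helper mirroring the described change: build the headers
// object for createOllama({ baseURL, headers }) when an API key is set.
function ollamaHeaders(apiKey?: string): Record<string, string> {
  return apiKey ? { Authorization: `Bearer ${apiKey}` } : {};
}
```

An empty object leaves unauthenticated local Ollama behavior unchanged, while Ollama Cloud requests carry the Bearer token.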
`http://localhost:11434`) still works with no API key entered

🤖 Generated with Claude Code
Greptile Summary
This PR fixes Ollama Cloud authentication by narrowing the "no API key needed" detection from any URL containing `ollama` or `:11434` to strictly `localhost`/`127.0.0.1` on port 11434, and forwards the API key as a `Bearer` header to `createOllama`. The `enabled` gate on the `getModels` query (`isLocalOllama || !!apiKey`) now blocks model listing for any non-localhost Ollama that has no API key (e.g. `http://192.168.1.100:11434`). Those instances don't need auth but will silently get no models in the dropdown.
Safe to merge for the primary Ollama Cloud use case, but introduces a regression for unauthenticated non-localhost Ollama instances.
The core fix (showing the API key field for Ollama Cloud and forwarding it as a Bearer token) is correct. However, the `enabled` gate for model listing now blocks model fetching for unauthenticated Ollama instances that aren't on localhost (e.g. LAN or remote Ollama without auth), which is a real UX regression from the previous behavior.

`apps/dokploy/components/dashboard/settings/handle-ai.tsx` — specifically the `enabled` condition on the `getModels` query

Reviews (1): Last reviewed commit: "fix(ai): allow Ollama Cloud API key in A..."