2 changes: 1 addition & 1 deletion docs/marketplace-docs/guides/gemma3/index.md
@@ -49,7 +49,7 @@ Before deployment, you need a Hugging Face API token to access the Gemma 3 model
1. Create a free account at [huggingface.co/join](https://huggingface.co/join).
1. Accept the Gemma license at [huggingface.co/google/gemma-3-12b-it](https://huggingface.co/google/gemma-3-12b-it).
1. Generate a token at [huggingface.co/settings/tokens](https://huggingface.co/settings/tokens). Read-only access is sufficient.
1. Provide this token during the Marketplace deployment process.
1. Provide this token during the deployment process.

{{% content "marketplace-required-limited-user-fields-shortguide" %}}

8 changes: 4 additions & 4 deletions docs/marketplace-docs/guides/gpt-oss-with-openwebui/index.md
@@ -1,5 +1,5 @@
---
title: "Deploy GPT-OSS with Open WebUI through the Linode Marketplace"
title: "Deploy GPT-OSS with Open WebUI"
description: "This guide includes instructions on how to deploy Open WebUI with GPT-OSS self-hosted LLM on an Akamai Compute Instance."
published: 2026-02-12
modified: 2026-02-12
Expand All @@ -19,9 +19,9 @@ marketplace_app_name: "GPT-OSS with Open WebUI"

Open WebUI is an open-source, self-hosted web interface for interacting with and managing Large Language Models (LLMs). It supports multiple AI backends, multi-user access, and extensible integrations, enabling secure and customizable deployment for local or remote model inference.

The Marketplace application deployed in this guide uses OpenAI GPT-OSS, a family of open-weight large language models designed for powerful reasoning, agentic tasks, and versatile developer use cases. During deployment, you can choose between two model sizes: GPT-OSS 20B (default) or GPT-OSS 120B. These models are released under the permissive Apache 2.0 license and integrate well with self-hosted platforms like Open WebUI for general-purpose assistance, coding, and knowledge-based workflows.
The Quick Deploy App deployed in this guide uses OpenAI GPT-OSS, a family of open-weight large language models designed for powerful reasoning, agentic tasks, and versatile developer use cases. During deployment, you can choose between two model sizes: GPT-OSS 20B (default) or GPT-OSS 120B. These models are released under the permissive Apache 2.0 license and integrate well with self-hosted platforms like Open WebUI for general-purpose assistance, coding, and knowledge-based workflows.

## Deploying a Marketplace App
## Deploying a Quick Deploy App

{{% content "deploy-marketplace-apps-shortguide" %}}

Expand All @@ -37,7 +37,7 @@ Open WebUI with GPT-OSS should be fully installed within 5-10 minutes after the
- **Recommended plan for GPT-OSS 120B:** RTX4000 Ada x1 Large or higher (64GB RAM minimum)

{{< note type="warning" >}}
This Marketplace App only works with Akamai GPU instances. If you choose a plan other than GPUs, the provisioning will fail, and a notice will appear in the LISH console.
This Quick Deploy App only works with Akamai GPU instances. If you choose a plan other than GPUs, the provisioning will fail, and a notice will appear in the LISH console.
{{< /note >}}

### GPT-OSS Options
137 changes: 137 additions & 0 deletions docs/marketplace-docs/guides/openclaw/index.md
@@ -0,0 +1,137 @@
---
title: "Deploy OpenClaw"
description: "This tutorial shows you how to deploy OpenClaw as a Quick Deploy App."
published: 2026-03-17
modified: 2026-03-17
keywords: ['AI', 'AI Agent']
tags: ["quick deploy apps", "AI", "AI Agent"]
aliases: ['/products/tools/marketplace/guides/openclaw/','/guides/openclaw/']
external_resources:
- '[OpenClaw](https://openclaw.ai/)'
- '[OpenClaw Documentation](https://docs.openclaw.ai/)'
authors: ["Akamai"]
contributors: ["Akamai"]
license: '[CC BY-ND 4.0](https://creativecommons.org/licenses/by-nd/4.0)'
marketplace_app_id: 2049320
marketplace_app_name: "OpenClaw"
---

[OpenClaw](https://openclaw.ai/) is an open-source AI agent platform that runs locally and executes tasks through a persistent Gateway service. The Gateway connects communication channels, tools, and AI models, allowing the agent to receive messages, perform actions, and automate workflows. Administrators configure and manage the system through a CLI onboarding wizard and a local web dashboard. Our Quick Deploy App allows you to connect to the OpenClaw dashboard via a secure HTTPS endpoint protected by HTPASSWD.

This Quick Deploy App creates a limited user called `openclaw` on the system.

## Deploying a Quick Deploy App

{{% content "deploy-marketplace-apps-shortguide" %}}

{{% content "marketplace-verify-standard-shortguide" %}}

{{< note >}}
**Estimated deployment time:** OpenClaw should be fully installed within 5-10 minutes after the Compute Instance has finished provisioning.
{{< /note >}}

## Configuration Options

- **Supported distributions:** Ubuntu 24.04 LTS
- **Recommended plan:** All plan types and sizes can be used.

## OpenClaw Options

- **Email address** *(required)*: Enter the email address you want to use for generating the SSL certificates via Let's Encrypt.

{{% content "marketplace-required-limited-user-fields-shortguide" %}}

{{% content "marketplace-custom-domain-fields-shortguide" %}}

{{% content "marketplace-special-character-limitations-shortguide" %}}

## Getting Started after Deployment

### Performing OpenClaw Onboard

Once the deployment is complete, `openclaw` is installed on the instance but is not yet running. Before you can start using OpenClaw, you must complete the onboarding wizard. This Quick Deploy App triggers the onboarding for you when you log in as root.

1. Log in to the instance.

If you disabled root login to the server during the setup of the OpenClaw app, log in to the server as the sudo user:

```command
ssh admin@YOUR_INSTANCE_IP
```

Replace `YOUR_INSTANCE_IP` with the IP address of your Linode and `admin` with the sudo user you created.

2. Escalate privileges to root.

Once you've logged in, you will see the message of the day (MOTD):

```output
*********************************************************
Akamai Connected Cloud OpenClaw Quick Deploy App
Dashboard URL: https://172-235-150-14.ip.linodeusercontent.com
Credentials File: /home/admin/.credentials
Documentation: https://www.linode.com/docs/marketplace-docs/guides/openclaw/
*********************************************************
```

Copy the sudo password from `~/.credentials` and enter the following command in the terminal:

```command
sudo su -
```

When prompted for the password, paste the sudo password you copied from the `~/.credentials` file. When you log in as **root**, the following message appears:
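If you prefer to pull the password out on the command line, the following is a minimal sketch. It works against a sample file here because the exact layout of `~/.credentials` can differ between deployments; the `Sudo password` key name and the one-entry-per-line format are assumptions.

```shell
# Sketch only: on the instance, run the grep against ~/.credentials instead.
# Assumes one "Key: value" entry per line; the key names here are examples.
cat > /tmp/credentials.sample <<'EOF'
Sudo password: s3cret-example
Htpasswd username: openclaw
EOF

# Print just the password value, stripping the key and surrounding spaces.
grep -i '^sudo password' /tmp/credentials.sample | cut -d':' -f2- | tr -d ' '
```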

![OpenClaw Init](openclaw-init.jpg)

When you are ready to perform the onboarding, enter `y` to launch OpenClaw's onboarding wizard and complete the setup.

![OpenClaw Onboard](openclaw-onboard.jpg)

Once the onboarding is complete, the onboarding script is removed.

### Confirm Gateway Status

At this point, OpenClaw is configured on the server. To verify that the gateway is running, you need to become the `openclaw` user. Enter the following as the **root** user:

```command
su - openclaw
```

To view the gateway status, enter the following as the **openclaw** user:

```command
openclaw gateway status
```

This should yield output similar to the following:

![OpenClaw GW Status](openclaw-gws.jpg)

### Dashboard Access

Once the onboarding is complete and the gateway is running, you can access the dashboard from the domain you configured during the initial deployment of the app. If you did not enter a domain name, the dashboard is accessible via the instance's rDNS value. You can view the rDNS value from the [Linode's Network](https://techdocs.akamai.com/cloud-computing/docs/configure-rdns-reverse-dns-on-a-compute-instance#setting-reverse-dns) tab.

In this example, we'll use the domain `172-233-177-79.ip.linodeusercontent.com`.

To authenticate to the dashboard, you need to provide two credentials:

1. **Dashboard token**: If you didn't save the token during onboarding, retrieve it with the following steps:

    1. Become the `openclaw` user: `su - openclaw`
    2. Run: `openclaw dashboard --no-open`
    3. Copy the entire token value (for example, `#token=a0764fb`) from the `Dashboard URL:` link.

2. **Nginx basic auth**: Get the `Htpasswd username` and `Htpasswd password` values from `/home/admin/.credentials`.
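The token is simply the fragment of the Dashboard URL after the `#`. As a rough illustration of what to copy (the URL and token below are made-up examples):

```shell
# Illustration only: extract the "#token=..." fragment from a Dashboard URL.
# Use the real URL printed by `openclaw dashboard --no-open` on your instance.
url="https://172-233-177-79.ip.linodeusercontent.com/#token=a0764fb"
echo "#${url#*#}"
# → #token=a0764fb
```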

At this point, you have everything you need to access the dashboard. For example:

`https://172-233-177-79.ip.linodeusercontent.com/#token=a0764fb`

When you access the web page, you are prompted for the htpasswd credentials.

![Nginx Basic Auth](openclaw-htpasswd.jpg)

Enter **openclaw** as the username and the password you got from the `/home/admin/.credentials` file.

{{% content "marketplace-update-note-shortguide" %}}
4 changes: 2 additions & 2 deletions docs/marketplace-docs/guides/qwen/index.md
@@ -18,7 +18,7 @@ marketplace_app_name: "Qwen Instruct with Open WebUI"

Open WebUI is an open-source, self-hosted web interface for interacting with and managing Large Language Models (LLMs). It supports multiple AI backends, multi-user access, and extensible integrations, enabling secure and customizable deployment for local or remote model inference.

The Marketplace application deployed in this guide uses a Qwen Instruct model as an instruction-tuned, open-weight LLM optimized for reasoning, code generation, and conversational tasks. Qwen models are designed for high-quality inference across a wide range of general-purpose and technical workloads and integrate seamlessly with self-hosted platforms like Open WebUI.
The Quick Deploy App deployed in this guide uses a Qwen Instruct model as an instruction-tuned, open-weight LLM optimized for reasoning, code generation, and conversational tasks. Qwen models are designed for high-quality inference across a wide range of general-purpose and technical workloads and integrate seamlessly with self-hosted platforms like Open WebUI.

## Deploying a Quick Deploy App

Expand All @@ -35,7 +35,7 @@ Open WebUI with Qwen Instruct should be fully installed within 5-10 minutes afte
- **Recommended plan:** RTX4000 Ada x1 Small or Larger GPU Instance

{{< note type="warning" >}}
This Marketplace App only works with Akamai GPU instances. If you choose a plan other than GPUs, the provisioning will fail, and a notice will appear in the LISH console.
This Quick Deploy App only works with Akamai GPU instances. If you choose a plan other than GPUs, the provisioning will fail, and a notice will appear in the LISH console.
{{< /note >}}

### Open WebUI Options