OpenClaw on your server.
No compromises.

Hosted OpenClaw means trusting someone else with your API keys. getbot runs on your laptop and deploys OpenClaw onto your Linux server over SSH. Your API keys never touch getbot infrastructure.

$ curl -fsSL https://getbot.run/install.sh | bash
# or, with wget:
$ wget -qO- https://getbot.run/install.sh | bash
  • ✓ API keys never touch getbot infrastructure
  • ✓ OpenClaw runs on your Linux server
  • ✓ Auth is enforced on your VPS

Invite-only · Your laptop runs getbot · Your Linux server runs OpenClaw

Works with Claude, ChatGPT, Gemini, DeepSeek, and local models.

⚡ How it works

Your laptop runs getbot. Your Linux server runs OpenClaw.

1. Install the CLI on your laptop

One curl command. Downloads a single binary — no Docker, no Node, no server knowledge needed.

curl -fsSL https://getbot.run/install.sh | bash
2. Point it at your server

getbot connects over SSH and sets up everything — containers, reverse proxy, TLS certificates, authentication. Any Linux VPS, bare metal, or cloud VM with SSH access.

getbot setup

Requires an invite code

3. OpenClaw is live

Your own OpenClaw instance — running in a container on your server, secured with HTTPS, protected by Google SSO. You didn't touch a config file.

✓ OpenClaw deployed at https://acme.getbot.run/marketing

Why not hosted OpenClaw?

Hosted providers promise isolation. But the trust boundary still runs through their servers, not yours.

API keys
  • Hosted OpenClaw: Your keys are entered into provider-controlled infrastructure.
  • With getbot: Sent directly to your server over SSH. They never touch getbot infrastructure.

Runtime
  • Hosted OpenClaw: The provider controls the servers, the network, and what runs alongside your agent.
  • With getbot: You control the server. Each org runs in its own Incus container with Docker inside.

Auth
  • Hosted OpenClaw: Authentication is handled entirely on the provider's infrastructure.
  • With getbot: Google OAuth triggers JWT minting on your VPS. Caddy enforces auth at your network edge.

Trust boundary
  • Hosted OpenClaw: Providers promise "isolation" and "privacy", but the boundary runs through their infrastructure, not yours.
  • With getbot: The trust boundary is your server. You can SSH in and inspect every running process.

Exit path
  • Hosted OpenClaw: Runtime and state stay under provider control. Leaving means starting over.
  • With getbot: Everything runs on your server. SSH in, inspect it, move it, or walk away.

Setup effort
  • Hosted OpenClaw: Sign up, paste your API key, and you're running in minutes.
  • With getbot: Two CLI commands, one Linux server. More steps, but you own the result.

Yes, getbot requires a Linux server. A $6/month VPS is all you need. The tradeoff is a few extra minutes for a trust boundary you can actually verify.

🛡️ Architecture at a glance

Three layers of isolation between the internet and your AI agent.

🌐 Internet: HTTPS requests
  ↓
🔒 Caddy reverse proxy: TLS termination + forward_auth
  ↓
🔐 Auth layer: Google SSO → JWT minting on your VPS
  ↓
📦 Incus container: isolated environment per organization
  ↓
🤖 OpenClaw AI agent: your keys, your model, your data
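The Caddy and auth layers above can be pictured with a minimal Caddyfile sketch. This is an illustrative assumption, not getbot's actual configuration; the hostname, ports, and verification path are invented:

```
acme.getbot.run {
	# Caddy obtains and renews the TLS certificate automatically.
	forward_auth localhost:9091 {
		# Each request is checked against the local auth service,
		# which validates the JWT minted after Google SSO.
		uri /api/verify
		copy_headers Remote-User
	}
	# Only authenticated requests reach the OpenClaw container.
	reverse_proxy localhost:8080
}
```

If the auth service rejects a request, Caddy returns its response (typically a redirect into the SSO flow) and the agent is never reached.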

🛠️ Built for trust

Every decision serves one principle: minimize trust boundaries.

🔒 Container Isolation

Each organization runs in its own Incus container with Docker-in-Incus. No shared runtimes, no container escapes to the host.
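As a rough sketch of the per-org layout (the container names and the OpenClaw image name are placeholders, not getbot's real naming scheme):

```sh
# One Incus system container per organization; security.nesting
# lets Docker run inside it without granting privileged host access.
incus launch images:debian/12 org-acme --config security.nesting=true
incus launch images:debian/12 org-globex --config security.nesting=true

# Each org's OpenClaw then runs as a Docker container *inside*
# its own Incus container ("openclaw/openclaw" is a placeholder).
incus exec org-acme -- docker run -d --name openclaw openclaw/openclaw
```

Because every org sits behind its own container boundary, a compromised agent in one org has no direct path to the host or to another org's runtime.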

🛡️ Trust-Minimized Auth

Google OAuth via auth.getbot.run, JWT minting on your VPS, Caddy forward_auth — your AI agent never sees raw credentials.

🤖 Any LLM Provider

Choose your AI provider during setup. Switch between Claude, ChatGPT, Gemini, or self-hosted models without redeploying.

🖥️ SSH-First Deploy

Point getbot at any server with SSH access. No cloud provider accounts required — works on bare metal, VPS, or existing infra.

🔐 Your Keys Stay Home

API keys are injected directly into your container via SSH. They never touch getbot servers, never appear in logs, never leave your VPS.
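One way to picture the injection step (a hedged sketch, not getbot's actual mechanism; the host, container, and secret path are invented for illustration):

```sh
# The key is read on your laptop and streamed straight over SSH;
# it never passes through getbot-side storage or logs.
printf '%s' "$ANTHROPIC_API_KEY" |
  ssh admin@your-server.example \
    'incus exec org-acme -- sh -c "umask 077; cat > /run/secrets/llm_api_key"'
```

The `umask 077` ensures the secret file is readable only by its owner inside the container.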

Zero Lock-in

Everything is open source. Uninstall getbot and your server is unchanged. No proprietary agents, no phone-home telemetry.

🛡️ The OpenClaw security model

When you self-host OpenClaw, your trust boundary shrinks to one: your own server.

🔐 Your server, your keys

API keys stay on your VPS. They never touch getbot servers, never leave your infrastructure, never appear in logs.

📦 Process isolation via Incus

Each organization gets its own Incus container with Docker inside. A compromised agent cannot reach the host or other tenants.

🔒 No ambient credentials

Auth flows through Google OAuth → JWT minting on your VPS → Caddy forward_auth. The AI agent never handles authentication tokens.

Zero vendor lock-in

Switch LLM providers without redeploying. Uninstall getbot and your server is unchanged. No proprietary agents, no phone-home telemetry.

📖 Documentation

Step-by-step guides covering every stage of a getbot deployment.

Browse all docs →

How access works

CLI and docs are public

Install the CLI, read the docs, and explore the architecture. No approval needed.

🔒 Deployment requires an invite code

getbot setup asks for an invite code. Request one below or get one from an existing user.

🔐 Your agent and API keys stay on your server

Keys are injected into your container via SSH. They never touch getbot infrastructure.

🛡️ Auth runs through getbot-managed infrastructure

Google OAuth flows through auth.getbot.run. Early users may use shared auth and DNS patterns while we harden per-org setup.

📝 How we secure self-hosted OpenClaw

Container isolation, trust-minimized auth, and zero ambient credentials. By design, not by accident.

Read the post →

Request an invite code

Deploy OpenClaw on your own Linux server. Your API keys are sent directly to your server over SSH, not to ours. The CLI is free to install. Deployment requires an invite code.

We review each request individually. Codes are sent within 1–2 business days.

We'll only email you about your invite code.