Quick Start¶
Deploy OpenClaw or OpenCode in under 5 minutes.
Prerequisites¶
A Kubernetes cluster with the Language Operator installed, and an API key for a large language model provider.
Step 1: Create a Cluster¶
A LanguageCluster is a managed namespace for logically grouped agents, models, and tools.
kubectl apply -f - <<EOF
apiVersion: langop.io/v1alpha1
kind: LanguageCluster
metadata:
  name: language-operator-demo
spec:
  domain: agents.example.com
EOF
Wait for it to be ready:
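A typical readiness check looks like the following; the condition name `Ready` is an assumption about the CRD's status conditions, so adjust it if `kubectl describe languagecluster` shows a different condition:

```shell
# Block until the LanguageCluster reports Ready (condition name assumed)
kubectl wait --for=condition=Ready \
  languagecluster/language-operator-demo --timeout=120s
```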
Switch into its namespace:
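The operator creates a namespace matching the cluster name, so you can point your current kubectl context at it:

```shell
# Make language-operator-demo the default namespace for this context
kubectl config set-context --current --namespace=language-operator-demo
```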
Step 2: Configure an LLM¶
Store your API key in a secret:
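The secret name and key below match what the LanguageModel in the next step references; substitute the placeholder with your real Anthropic API key:

```shell
# Secret name and key must match the LanguageModel's apiKeySecretRef
kubectl create secret generic anthropic-credentials \
  --from-literal=api-key=sk-ant-your-key-here
```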
Create a LanguageModel pointing to it:
kubectl apply -f - <<EOF
apiVersion: langop.io/v1alpha1
kind: LanguageModel
metadata:
  name: claude-sonnet
spec:
  provider: anthropic
  modelName: claude-sonnet-4-5
  apiKeySecretRef:
    name: anthropic-credentials
    key: api-key
EOF
Step 3: Deploy an Agent¶
Choose one of the bundled runtimes:
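As one illustration of what an agent manifest might look like: the kind `LanguageAgent` and the spec fields below are assumptions, not confirmed API, so consult the operator's CRD reference (or `kubectl explain`) for the real schema:

```shell
# Hypothetical manifest: kind and spec fields are assumptions
kubectl apply -f - <<EOF
apiVersion: langop.io/v1alpha1
kind: LanguageAgent
metadata:
  name: openclaw
spec:
  runtime: openclaw       # or: opencode
  model: claude-sonnet    # the LanguageModel created in Step 2
EOF
```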
Step 4: Check Status¶
Watch the agent pod come up:
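For instance:

```shell
# Watch pods in the cluster namespace until the runtime is Running
kubectl get pods -w
```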
Step 5: Access the Agent¶
Retrieve the auto-generated gateway token:
TOKEN=$(kubectl get secret openclaw-runtime -o jsonpath='{.data.OPENCLAW_GATEWAY_TOKEN}' | base64 -d)
echo "Token: $TOKEN"
If you have a domain configured on your LanguageCluster, open https://openclaw.<cluster-domain> and enter the token when prompted, or connect the OpenClaw CLI directly to wss://openclaw.<cluster-domain>.
Otherwise, port-forward for local access:
kubectl port-forward svc/openclaw 18789:18789
# connect to ws://localhost:18789 with the token above
OpenClaw uses a WebSocket gateway on port 18789. Connect using the OpenClaw browser extension or CLI client (see github.com/openclaw/openclaw).
Retrieve the auto-generated credentials:
USERNAME=$(kubectl get secret opencode-runtime -o jsonpath='{.data.OPENCODE_SERVER_USERNAME}' | base64 -d)
PASSWORD=$(kubectl get secret opencode-runtime -o jsonpath='{.data.OPENCODE_SERVER_PASSWORD}' | base64 -d)
echo "username: $USERNAME password: $PASSWORD"
If you have a domain configured on your LanguageCluster, open https://opencode.<cluster-domain> and sign in with the credentials above.
To attach the TUI (OpenCode v1.0.10+):
Otherwise, port-forward for local access:
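Assuming the Service is named `opencode` (mirroring `svc/openclaw` above) and serves on port 3000, which the next step expects:

```shell
# Forward local port 3000 to the OpenCode service (name assumed)
kubectl port-forward svc/opencode 3000:3000
```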
Then open http://localhost:3000 or attach the TUI:
What Just Happened?¶
The operator automatically:
- Created a dedicated namespace (language-operator-demo)
- Deployed a LiteLLM proxy with your model credentials
- Configured and deployed your agent