Getting Started
Get nclaw running in under 5 minutes.
Step 1: Create a Telegram Bot
- Open Telegram and search for @BotFather (or open t.me/BotFather).
- Send `/newbot`.
- Choose a display name for your bot (e.g. “My Coding Assistant”).
- Choose a username — it must end in `bot` (e.g. `my_coding_assistant_bot`).
- BotFather replies with your bot token — a string like `123456789:ABCdefGhIjKlMnOpQrStUvWxYz`. Save it.
Tip: You can customize the bot later — send `/mybots` to BotFather to change the name, description, profile picture, and more.
If you want the bot in a group with topics (one topic per project), also configure these via BotFather:
- Send `/mybots` → select your bot → Bot Settings → Group Privacy → Turn off. This lets the bot read all messages in group chats, not just commands.
- Send `/setjoingroups` → select your bot → Enable. This allows adding the bot to groups.
Step 2: Find Your Chat ID
NClaw uses NCLAW_TELEGRAM_WHITELIST_CHAT_IDS to restrict which chats the bot responds in. This setting is optional, but strongly recommended — without it, anyone who discovers your bot can send it commands with full access to the container’s file system, shell, and network. You need the numeric chat ID.
For a private chat (1-on-1 with the bot):
- Message your bot (send anything — it won’t reply yet).
- Open this URL in a browser, replacing `<TOKEN>` with your bot token: `https://api.telegram.org/bot<TOKEN>/getUpdates`
- Find `"chat":{"id":123456789}` in the JSON response. That number is your chat ID.
For a group chat:
- Add the bot to the group.
- Send a message in the group.
- Use the same `getUpdates` URL above. The group chat ID is a negative number (e.g. `-1001234567890`).
Tip: You can whitelist multiple chat IDs by separating them with commas: `123456789,-1001234567890`.
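If you prefer the terminal, the chat IDs can be pulled straight out of the `getUpdates` response. This is a sketch, not part of nclaw: the grep patterns assume Telegram's compact JSON (no spaces inside `"chat":{"id":...}`), which is what the API currently returns. Replace `<TOKEN>` with your bot token.

```shell
# Fetch pending updates and extract every chat id.
# Negative ids are group chats; positive ids are private chats.
curl -s "https://api.telegram.org/bot<TOKEN>/getUpdates" \
  | grep -oE '"chat":\{"id":-?[0-9]+' \
  | grep -oE -- '-?[0-9]+$' \
  | sort -u
```

The output is ready to paste into NCLAW_TELEGRAM_WHITELIST_CHAT_IDS (joined with commas).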
Step 3: Run NClaw
The fastest way to get started is with the multi-model image using a free Gemini API key:
- Get a free API key from Google AI Studio.
- Run:
docker run -d --name nclaw \
-e NCLAW_TELEGRAM_BOT_TOKEN=your-bot-token \
-e NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=your-chat-id \
-e NCLAW_DATA_DIR=/app/data \
-e NCLAW_MODEL=gemini@gemini-2.5-flash \
-e GEMINI_API_KEY=your-gemini-key \
-v ./data:/app/data \
ghcr.io/nickalie/nclaw:multi-model
- Message your bot in Telegram — it should reply.
To use Claude Code instead (requires an Anthropic account with Claude Code access):
- Install Claude Code and authenticate:
curl -fsSL https://claude.ai/install.sh | bash
claude login
- Run:
docker run -d --name nclaw \
-e NCLAW_TELEGRAM_BOT_TOKEN=your-bot-token \
-e NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=your-chat-id \
-e NCLAW_DATA_DIR=/app/data \
-v ./data:/app/data \
-v ~/.claude/.credentials.json:/root/.claude/.credentials.json:ro \
ghcr.io/nickalie/nclaw:claude
See Docker for all image variants and Configuration for the full list of options.
Docker
The recommended way to run nclaw. The container serves as a security sandbox, and the image ships with all the tools the assistant might need.
NClaw provides six Docker images, all based on node:24-alpine with shared tools (git, gh CLI, Chromium, Go, Node.js, Python/uv, skills). They differ only in which CLI agent is pre-installed:
| Image | Tag | CLI Backends | Size |
|---|---|---|---|
| All-in-one | latest | Claude Code + Multi-Model + Codex + Copilot + Gemini | Largest |
| Claude | claude | Claude Code | Medium |
| Multi-Model | multi-model | Claude Code + Multi-Model | Medium |
| Codex | codex | OpenAI Codex | Medium |
| Copilot | copilot | GitHub Copilot | Medium |
| Gemini | gemini | Google Gemini CLI | Medium |
All images are published to ghcr.io/nickalie/nclaw and built for linux/amd64 and linux/arm64. Docker automatically pulls the correct architecture — no extra flags needed. This means you can run nclaw on:
- Raspberry Pi (4/5 or any arm64 board) — a dedicated AI coding assistant on a $35 device
- AWS Graviton instances — lower cost and better price-performance than x86
- Apple Silicon Macs — native arm64 without Rosetta emulation
- Oracle Cloud Ampere or any other arm64 cloud VM
The assistant can install additional packages at runtime (e.g. apk add ffmpeg, pip install pandas, npm install -g typescript).
Claude (default)
docker run -d --name nclaw \
-e NCLAW_TELEGRAM_BOT_TOKEN=your-token \
-e NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=your-chat-id \
-e NCLAW_DATA_DIR=/app/data \
-v ./data:/app/data \
-v ~/.claude/.credentials.json:/root/.claude/.credentials.json:ro \
ghcr.io/nickalie/nclaw:claude
Claude Code uses OAuth authentication. Mount your credentials file from ~/.claude/.credentials.json. To obtain credentials, install Claude Code locally and run claude login.
Multi-Model
docker run -d --name nclaw \
-e NCLAW_TELEGRAM_BOT_TOKEN=your-token \
-e NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=your-chat-id \
-e NCLAW_DATA_DIR=/app/data \
-e NCLAW_MODEL=gemini@gemini-2.5-flash \
-e GEMINI_API_KEY=your-gemini-key \
-v ./data:/app/data \
ghcr.io/nickalie/nclaw:multi-model
docker run -d --name nclaw \
-e NCLAW_TELEGRAM_BOT_TOKEN=your-token \
-e NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=your-chat-id \
-e NCLAW_DATA_DIR=/app/data \
-e NCLAW_MODEL=zai@glm-4 \
-e ZAI_API_KEY=your-zai-key \
-v ./data:/app/data \
ghcr.io/nickalie/nclaw:multi-model
Setting NCLAW_MODEL automatically selects the multi-model backend. No Anthropic credentials are needed — only an API key from your chosen provider. See Multi-Model for the full list of providers and models.
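The value follows a `provider@model` convention, split at the first `@` (e.g. `zai@glm-4` means provider `zai`, model `glm-4`). As a quick illustration — the variable names below are mine, not nclaw's — shell parameter expansion splits it the same way:

```shell
# Split a provider@model value at the '@' using shell parameter expansion.
NCLAW_MODEL="zai@glm-4"
provider="${NCLAW_MODEL%%@*}"  # strip from the '@' onward: "zai"
model="${NCLAW_MODEL#*@}"      # strip through the '@': "glm-4"
echo "provider=$provider model=$model"
```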
Codex
docker run -d --name nclaw \
-e NCLAW_TELEGRAM_BOT_TOKEN=your-token \
-e NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=your-chat-id \
-e NCLAW_DATA_DIR=/app/data \
-e NCLAW_CLI=codex \
-v ./data:/app/data \
-v ~/.codex/auth.json:/root/.codex/auth.json:ro \
ghcr.io/nickalie/nclaw:codex
Codex uses ChatGPT OAuth authentication. Mount your auth file from ~/.codex/auth.json. To obtain credentials, install Codex locally (npm install -g @openai/codex) and sign in on first run.
Copilot
docker run -d --name nclaw \
-e NCLAW_TELEGRAM_BOT_TOKEN=your-token \
-e NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=your-chat-id \
-e NCLAW_DATA_DIR=/app/data \
-e NCLAW_CLI=copilot \
-v ./data:/app/data \
-v ~/.copilot/config.json:/root/.copilot/config.json:ro \
ghcr.io/nickalie/nclaw:copilot
Copilot uses GitHub OAuth authentication. Mount your config file from ~/.copilot/config.json. To obtain credentials, install Copilot CLI locally (npm install -g @github/copilot) and run /login.
Gemini
docker run -d --name nclaw \
-e NCLAW_TELEGRAM_BOT_TOKEN=your-token \
-e NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=your-chat-id \
-e NCLAW_DATA_DIR=/app/data \
-e NCLAW_CLI=gemini \
-v ./data:/app/data \
-v ~/.gemini/oauth_creds.json:/root/.gemini/oauth_creds.json:ro \
ghcr.io/nickalie/nclaw:gemini
Gemini CLI uses Google account OAuth authentication. Mount your credentials file from ~/.gemini/oauth_creds.json. To obtain credentials, install Gemini CLI locally (npm install -g @google/gemini-cli) and sign in on first run.
All-in-one
docker run -d --name nclaw \
-e NCLAW_TELEGRAM_BOT_TOKEN=your-token \
-e NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=your-chat-id \
-e NCLAW_DATA_DIR=/app/data \
-v ./data:/app/data \
-v ~/.claude/.credentials.json:/root/.claude/.credentials.json:ro \
ghcr.io/nickalie/nclaw:latest
The all-in-one image includes all five CLI agents. Set NCLAW_CLI to claude (default), claudish (multi-model), codex, copilot, or gemini to choose the agent. Mount the appropriate credentials for your chosen agent.
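For example, to run the all-in-one image with the Codex agent selected — a sketch that combines the flags from the Codex section above with the latest tag:

```shell
docker run -d --name nclaw \
  -e NCLAW_TELEGRAM_BOT_TOKEN=your-token \
  -e NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=your-chat-id \
  -e NCLAW_DATA_DIR=/app/data \
  -e NCLAW_CLI=codex \
  -v ./data:/app/data \
  -v ~/.codex/auth.json:/root/.codex/auth.json:ro \
  ghcr.io/nickalie/nclaw:latest
```

Swap the NCLAW_CLI value and the credentials mount together — a mismatched pair leaves the selected agent unauthenticated.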
Webhooks
To enable webhooks, add the webhook base domain and expose the port:
docker run -d --name nclaw \
-e NCLAW_TELEGRAM_BOT_TOKEN=your-token \
-e NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=your-chat-id \
-e NCLAW_DATA_DIR=/app/data \
-e NCLAW_WEBHOOK_BASE_DOMAIN=example.com \
-e NCLAW_WEBHOOK_PORT=:3000 \
-p 3000:3000 \
-v ./data:/app/data \
-v ~/.claude/.credentials.json:/root/.claude/.credentials.json:ro \
ghcr.io/nickalie/nclaw:latest
Kubernetes (Helm)
The Helm chart is published as an OCI artifact to GHCR. Since all Docker images are multi-arch (amd64/arm64), the chart works on mixed-architecture clusters — including AWS Graviton node pools, Raspberry Pi k3s clusters, and Apple Silicon dev machines.
helm install nclaw oci://ghcr.io/nickalie/charts/nclaw \
--set env.telegramBotToken=your-token \
--set env.whitelistChatIds=your-chat-id \
--set claudeCredentialsSecret=my-claude-secret
Create the credentials secret for your chosen agent:
# Claude
kubectl create secret generic my-claude-secret \
--from-file=credentials.json=$HOME/.claude/.credentials.json
# Codex
kubectl create secret generic my-codex-secret \
--from-file=auth.json=$HOME/.codex/auth.json
# Copilot
kubectl create secret generic my-copilot-secret \
--from-file=config.json=$HOME/.copilot/config.json
# Gemini
kubectl create secret generic my-gemini-secret \
--from-file=oauth_creds.json=$HOME/.gemini/oauth_creds.json
See the Configuration page for the full list of Helm values and GitOps Deployment for FluxCD/ArgoCD manifests.
Running without Docker
nclaw is a regular executable and can run directly on any machine. The only runtime dependency is the CLI for your chosen agent — Claude Code (default), claudish (multi-model), OpenAI Codex, GitHub Copilot, or Gemini CLI — which must be installed and available in PATH.
Security notice: Without Docker, the CLI agent runs directly on the host with the same permissions as the nclaw process. It has full access to the file system, network, and any credentials available to the user. Run under a dedicated unprivileged user and avoid running as root. For production use, Docker or Kubernetes deployment is strongly recommended.
Installation
Homebrew (macOS/Linux)
brew install --cask nickalie/apps/nclaw
Scoop (Windows)
scoop bucket add nickalie https://github.com/nickalie/scoop-bucket
scoop install nclaw
Chocolatey (Windows)
choco install nclaw
Winget (Windows)
winget install nickalie.nclaw
AUR (Arch Linux)
yay -S nclaw-bin
DEB / RPM / APK
Download the appropriate package from the Releases page:
# Debian/Ubuntu
sudo dpkg -i nclaw_*.deb
# Fedora/RHEL
sudo rpm -i nclaw_*.rpm
# Alpine
sudo apk add --allow-untrusted nclaw_*.apk
Binary download
Pre-built binaries for Linux, macOS, and Windows (amd64/arm64) are available on the Releases page.
Go install
CGO_ENABLED=1 go install github.com/nickalie/nclaw/cmd/nclaw@latest
Requires Go 1.25+ and a C compiler (CGO is needed for SQLite).
Usage
- Install Claude Code CLI and authenticate:
curl -fsSL https://claude.ai/install.sh | bash
claude login
- Create a `.env` file or export environment variables:
export NCLAW_TELEGRAM_BOT_TOKEN=your-token
export NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=your-chat-id
export NCLAW_DATA_DIR=./data
- Run:
nclaw
Any tools you want the assistant to use (git, gh, python, etc.) should be installed on the host. The assistant will use whatever is available in the system PATH.
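Following the security notice above, one way to run under a dedicated unprivileged user — the `nclaw-bot` user name and data path below are arbitrary choices, not project defaults:

```shell
# Create a dedicated system user with no login shell.
sudo useradd --system --create-home --shell /usr/sbin/nologin nclaw-bot
# Run nclaw as that user, passing only the variables it needs.
sudo -u nclaw-bot env \
  NCLAW_TELEGRAM_BOT_TOKEN=your-token \
  NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=your-chat-id \
  NCLAW_DATA_DIR=/home/nclaw-bot/data \
  nclaw
```

This limits what the agent can read or modify to that user's files; it is not a substitute for the container sandbox.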
Next Steps
- Configure your bot settings
- Learn about features
- Set up a multi-model backend
- Explore built-in skills