# Configuration
nclaw reads configuration from environment variables, .env files, or YAML config files.
## Environment Variables

nclaw variables use the `NCLAW_` prefix. Provider API keys use the provider's native env var name (no prefix) and pass through to the multi-model backend automatically.
| Variable | Required | Default | Description |
|---|---|---|---|
| `NCLAW_TELEGRAM_BOT_TOKEN` | Yes | — | Telegram bot token from @BotFather |
| `NCLAW_DATA_DIR` | Yes | — | Base directory for session data and files |
| `NCLAW_CLI` | No | `claude` | CLI agent: `claude`, `claudish` (multi-model), `codex`, `copilot`, or `gemini`. Auto-selects `claudish` when `NCLAW_MODEL` is set |
| `NCLAW_MODEL` | No | — | Model for the multi-model backend (e.g. `[email protected]`). Setting this auto-selects the multi-model backend |
| `NCLAW_TELEGRAM_WHITELIST_CHAT_IDS` | No | — | Comma-separated list of allowed Telegram chat IDs. If unset, accepts all chats (with a security warning) |
| `NCLAW_DB_PATH` | No | `{data_dir}/nclaw.db` | Path to the SQLite database |
| `NCLAW_TIMEZONE` | No | system local | Timezone for the scheduler (e.g. `Europe/Berlin`) |
| `NCLAW_WEBHOOK_BASE_DOMAIN` | No | — | Base domain for webhook URLs (required when using webhooks) |
| `NCLAW_WEBHOOK_PORT` | No | `:3000` | Webhook HTTP server listen address |
**Security notice:** If `NCLAW_TELEGRAM_WHITELIST_CHAT_IDS` is not set, the assistant will accept messages from any Telegram chat. Since nclaw runs the CLI agent with full tool access (file system, shell, network), this effectively gives anyone who discovers your bot unrestricted access to the host environment. Always set this variable in production.
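Putting this together, a minimal launch environment needs only the two required variables plus the recommended whitelist. A sketch with placeholder values:

```shell
# Minimal environment sketch; only the bot token and data
# directory are strictly required.
export NCLAW_TELEGRAM_BOT_TOKEN="123456:your-bot-token"   # from @BotFather
export NCLAW_DATA_DIR=/data
# Strongly recommended: without this, any Telegram chat is accepted.
export NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=12345678
```

With these exported, starting the assistant picks them up automatically; any optional variable left unset falls back to the default in the table above.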
## Provider API Keys
When using the multi-model backend, set the API key for your chosen provider as a regular environment variable (without the NCLAW_ prefix):
| Variable | Provider |
|---|---|
| `OPENROUTER_API_KEY` | OpenRouter |
| `GEMINI_API_KEY` | Google Gemini |
| `OPENAI_API_KEY` | OpenAI |
| `VERTEX_API_KEY` | Vertex AI |
| `OLLAMA_API_KEY` | OllamaCloud |
| `MOONSHOT_API_KEY` | Kimi |
| `ZHIPU_API_KEY` | GLM (Zhipu) |
| `ZAI_API_KEY` | Z.AI |
| `MINIMAX_API_KEY` | MiniMax |
| `POE_API_KEY` | Poe |
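As a sketch of the pass-through behavior, selecting an OpenRouter model takes only two variables: the key name comes from the table above, and the model string follows the `or@...` pattern shown in the config file example (the key value here is a placeholder):

```shell
# Sketch: use OpenRouter through the multi-model backend.
# Setting NCLAW_MODEL auto-selects the claudish (multi-model) CLI,
# so NCLAW_CLI does not need to be set explicitly.
export OPENROUTER_API_KEY="sk-or-placeholder-key"
export NCLAW_MODEL="or@mistralai/mistral-large"
```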
Local providers (Ollama, LM Studio, vLLM, MLX) don’t require API keys. Set base URL variables to connect to custom endpoints:
| Variable | Provider | Default |
|---|---|---|
| `OLLAMA_BASE_URL` | Ollama | `http://localhost:11434` |
| `LMSTUDIO_BASE_URL` | LM Studio | `http://localhost:1234` |
| `VLLM_BASE_URL` | vLLM | `http://localhost:8000` |
| `MLX_BASE_URL` | MLX | `http://localhost:8080` |
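For example, to reach an Ollama server running on another machine, only the base URL needs to change (the host and port here are illustrative):

```shell
# Sketch: point nclaw at a remote Ollama instance. Local providers
# need no API key, only a reachable endpoint.
export OLLAMA_BASE_URL="http://192.168.1.50:11434"
```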
## Config File

nclaw looks for `config.yaml` in the current directory or `$HOME/.nclaw/`. Nested keys map to env vars with underscores (e.g. `telegram.bot_token` maps to `NCLAW_TELEGRAM_BOT_TOKEN`).
```yaml
telegram:
  bot_token: "your-telegram-bot-token"
  whitelist_chat_ids: "123456789,987654321"

cli: "claude" # Options: claude, claudish, codex, copilot, gemini
data_dir: "/app/data"
db_path: "/app/data/nclaw.db"
timezone: "Europe/Berlin"

# Multi-model settings (setting model auto-selects the multi-model backend)
model: "" # e.g. "[email protected]", "or@mistralai/mistral-large"
# Provider API keys are set as regular env vars (not in this file):
# OPENROUTER_API_KEY, GEMINI_API_KEY, OPENAI_API_KEY, etc.

webhook:
  base_domain: "example.com"
  port: ":3000"
```
## Example .env
```env
NCLAW_TELEGRAM_BOT_TOKEN=123456:ABC-DEF1234ghIkl-zyx57W2v1u123ew11
NCLAW_TELEGRAM_WHITELIST_CHAT_IDS=12345678,87654321
NCLAW_DATA_DIR=/data

# Multi-model (uncomment to use)
# [email protected]
# GEMINI_API_KEY=your-key
```
## Helm Values
When deploying with the Helm chart, configuration is passed via Helm values:
| Parameter | Default | Description |
|---|---|---|
| `image.repository` | `ghcr.io/nickalie/nclaw` | Docker image |
| `image.tag` | Chart `appVersion` | Image tag |
| `env.dataDir` | `/app/data` | Data directory inside container |
| `env.telegramBotToken` | `""` | Telegram bot token |
| `env.whitelistChatIds` | `""` | Comma-separated allowed chat IDs |
| `env.webhookBaseDomain` | `""` | Base domain for webhook URLs |
| `env.cli` | `""` | CLI agent: `claude`, `claudish` (multi-model), `codex`, `copilot`, or `gemini` (empty = image default) |
| `env.model` | `""` | Model for the multi-model backend (e.g. `[email protected]`). Setting this auto-selects the multi-model backend |
| `existingSecret` | `""` | Use existing secret for bot token (key: `telegram-bot-token`) |
| `claudeCredentialsSecret` | `""` | Secret with Claude credentials (key: `credentials.json`) |
| `codexCredentialsSecret` | `""` | Secret with Codex credentials (key: `auth.json`) |
| `copilotCredentialsSecret` | `""` | Secret with Copilot credentials (key: `config.json`) |
| `geminiCredentialsSecret` | `""` | Secret with Gemini credentials (key: `oauth_creds.json`) |
| `persistence.enabled` | `true` | Enable persistent storage |
| `persistence.size` | `1Gi` | PVC size |
| `persistence.storageClass` | `""` | Storage class |
| `persistence.existingClaim` | `""` | Use existing PVC |
| `rbac.create` | `true` | Create ServiceAccount and ClusterRoleBinding |
| `rbac.clusterRole` | `cluster-admin` | ClusterRole to bind |
| `proxy.enabled` | `false` | Enable HTTP proxy |
| `proxy.httpProxy` | `""` | `HTTP_PROXY` value |
| `proxy.httpsProxy` | `""` | `HTTPS_PROXY` value |
| `resources.requests.cpu` | `100m` | CPU request |
| `resources.requests.memory` | `128Mi` | Memory request |
| `resources.limits.cpu` | `1000m` | CPU limit |
| `resources.limits.memory` | `2Gi` | Memory limit |
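The parameters above map onto a values file in the usual Helm way. A minimal sketch follows; the bot token, chat ID, and storage size are placeholders, and the chart's own `values.yaml` remains the authoritative reference for defaults:

```yaml
# my-values.yaml (sketch with placeholder values)
env:
  telegramBotToken: "123456:your-bot-token"
  whitelistChatIds: "12345678"
  cli: "claude"

persistence:
  enabled: true
  size: 2Gi
```

Install with `helm install nclaw <chart-ref> -f my-values.yaml`, substituting the chart reference the project publishes. For production, prefer `existingSecret` over putting the bot token in a values file, so the token never lands in plain-text configuration.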