If you've been looking for a self-hosted AI assistant that actually fits into your existing workflow — your terminal, your messaging apps, your infrastructure — this OpenClaw setup guide is exactly what you need. By the end of this post, you'll have OpenClaw running on your machine, connected to Telegram, and ready to take commands.
## What is OpenClaw?
OpenClaw is an open-source AI personal assistant CLI that runs on your own hardware and connects to the AI models and messaging platforms you already use. Unlike cloud-only assistants, OpenClaw puts you in control — your data stays on your infrastructure, your keys are your own, and you can extend it with custom skills and automations. Think of it as a programmable, self-hosted Jarvis for developers and operators.
## Prerequisites

- **Node.js 18 or higher** — check with `node -v`. Use nvm to upgrade if needed.
- **npm** — comes with Node.js. Verify with `npm -v`.
- **A messaging account** — Telegram is the easiest to start with.
- **An AI API key** — OpenClaw supports OpenAI, Anthropic, GitHub Copilot models, and others.
- **Linux or macOS** — Windows users can use WSL2.
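The Node version requirement is the one that most often bites, so it's worth checking before you install anything. Here's a small generic preflight script — ordinary shell, not part of OpenClaw itself:

```shell
#!/usr/bin/env sh
# Preflight: OpenClaw needs Node.js 18 or newer.
# Extract the major version from `node -v` output like "v18.19.0".
major=$(node -v 2>/dev/null | sed 's/^v\([0-9]*\).*/\1/')
if [ -z "$major" ]; then
  echo "Node.js not found - install it first (nvm works well)."
elif [ "$major" -lt 18 ]; then
  echo "Node.js $major is too old - OpenClaw needs 18+."
else
  echo "Node.js $major is fine."
fi
```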
## Installation

### Install via npm

```bash
npm install -g openclaw
```
Verify the install:

```bash
openclaw --version
```

If you get `command not found`, add npm's global bin directory to your `PATH`:

```bash
npm config get prefix
export PATH="$(npm config get prefix)/bin:$PATH"
```
### Initialize the workspace

```bash
openclaw init
```

This scaffolds your workspace at `~/.openclaw/workspace` with default config and memory files.
## Basic Configuration

Open the config file in your editor of choice:

```bash
nano ~/.openclaw/config.yaml
```

### Set your AI model

```yaml
model:
  provider: openai
  name: gpt-4o
  apiKey: sk-your-openai-api-key-here
```
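Since the API key sits in this file in plaintext, restrict who can read it. This is ordinary Unix file hygiene, not an OpenClaw-specific feature:

```shell
# Make the config readable and writable by your user only (mode 600).
chmod 600 ~/.openclaw/config.yaml
ls -l ~/.openclaw/config.yaml   # should show -rw-------
```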
## Connecting to Telegram

### Step 1: Create a Telegram Bot

- Open Telegram and message @BotFather
- Send `/newbot` and follow the prompts
- Copy the bot token BotFather gives you
### Step 2: Add the Telegram plugin to your config

```yaml
plugins:
  telegram:
    enabled: true
    token: "7123456789:AAF_your_telegram_bot_token_here"
    allowedUsers:
      - your_telegram_username
```
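You can sanity-check the token independently of OpenClaw using Telegram's public Bot API: the `getMe` method returns the bot's identity when the token is valid. The token below is the placeholder from the config above — substitute your real one:

```shell
# Query Telegram's Bot API directly; a valid token returns {"ok":true,...}.
TOKEN="7123456789:AAF_your_telegram_bot_token_here"
curl -s "https://api.telegram.org/bot${TOKEN}/getMe"
```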
### Step 3: Start the gateway

```bash
openclaw gateway start
```
To run it persistently as a systemd service, create a unit file:

```ini
# /etc/systemd/system/openclaw.service
[Unit]
Description=OpenClaw Gateway
After=network.target

[Service]
ExecStart=/usr/bin/openclaw gateway start
Restart=always
User=youruser
Environment=NODE_ENV=production

[Install]
WantedBy=multi-user.target
```

If `openclaw` isn't at `/usr/bin/openclaw` on your system (npm globals often live elsewhere), point `ExecStart` at the path `which openclaw` reports. Then enable and start the service:

```bash
sudo systemctl daemon-reload
sudo systemctl enable openclaw
sudo systemctl start openclaw
```
## First Commands — Verifying It Works

Open Telegram, find your bot, and send it a message:

- Send `Hello` — you should get an AI response
- Send `/status` — shows session info, model, and runtime
Or test directly from the CLI:

```bash
openclaw chat "What's in my workspace directory?"
```

Check gateway status and logs:

```bash
openclaw gateway status
openclaw gateway logs --tail 50
```
## Tips and Next Steps

### 1. Write a SOUL.md

Drop a `SOUL.md` in your workspace to give your assistant a personality, tone, and priorities. OpenClaw reads it at every session start — treat it like a version-controlled system prompt.
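There's no required schema — it's free-form instructions. A minimal sketch of what a `SOUL.md` might contain (contents are purely illustrative):

```markdown
# SOUL.md

You are a concise, pragmatic assistant for a solo operator.

- Prefer short answers with runnable commands over long explanations.
- Ask before doing anything destructive or irreversible.
- Use plain language; skip filler and apologies.
```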
### 2. Add custom Skills

Skills are `SKILL.md` files that give the assistant specialized instructions for specific tasks — Notion integration, GitHub workflows, weather lookups, or any repeatable automation.
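As with `SOUL.md`, a skill is just written instructions. A hypothetical example — the task and the steps here are invented for illustration, not a built-in skill:

```markdown
# SKILL.md: server health check (illustrative)

When asked "how are the servers?":

1. Run `uptime` and `df -h` on each host listed in MEMORY.md.
2. Summarize load and disk usage in one short paragraph.
3. Flag anything above 90% disk usage explicitly.
```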
### 3. Use MEMORY.md for persistent context

OpenClaw starts fresh each session but reads your workspace files as memory. Keep a `MEMORY.md` with ongoing tasks, key facts, and important context.
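The structure is up to you; anything that helps the assistant pick up where it left off belongs here. An illustrative layout (the hosts and tasks are made up):

```markdown
# MEMORY.md

## Ongoing
- Migrating the blog from WordPress to Hugo (target: end of month)

## Key facts
- Primary server: hetzner-1, Ubuntu 24.04
- Backups run nightly at 02:00 via restic
```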
### 4. Lock it down

Always set `allowedUsers` in your plugin config. Treat your bot token like a password — rotate it immediately if it's ever exposed.

### 5. Pin your Node version

Keep a `.nvmrc` in your workspace to prevent version drift from breaking things after a Node upgrade.
## Need Help With Enterprise Deployment?
Setting up OpenClaw for a team, integrating it with your existing infrastructure, or building custom skills and automations at scale? We can help.
📚 More OpenClaw Guides on Sysbrix