Getting Started

The fastest way to run Clawbolt is with Docker Compose. Before you begin, you will need:

  • Docker and Docker Compose
  • A messaging channel: either an iMessage backend (Linq for hosted iMessage/RCS/SMS, or BlueBubbles for self-hosted iMessage) or a Telegram bot token
  • An LLM provider API key (OpenAI, Anthropic, etc.)
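Before going further, it can help to confirm that the Docker CLI and the Compose v2 plugin are both on your PATH. The helper below is an illustrative sketch, not part of the Clawbolt repo:

```sh
# check_prereqs: report whether the docker CLI and Compose v2 plugin respond.
# Illustrative helper only — not shipped with Clawbolt.
check_prereqs() {
  for tool in "docker --version" "docker compose version"; do
    if $tool >/dev/null 2>&1; then
      echo "ok: $tool"
    else
      echo "missing: $tool"
    fi
  done
}

check_prereqs
```

If either line reports `missing`, install Docker Desktop (or Docker Engine plus the Compose plugin) before continuing.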
Clone the repository:

git clone https://github.com/mozilla-ai/clawbolt.git
cd clawbolt
Copy the example environment file:

cp .env.example .env

Edit .env and fill in the required credentials. See Configuration for full details.

At minimum you need:

  • An LLM API key (OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.)
  • VISION_MODEL — the model used for image analysis (defaults to LLM_MODEL if not set)
  • At least one messaging channel configured:
    • iMessage (recommended): choose one backend. Hosted iMessage/RCS/SMS via Linq (LINQ_API_TOKEN + LINQ_FROM_NUMBER), or self-hosted iMessage via BlueBubbles (BLUEBUBBLES_SERVER_URL + BLUEBUBBLES_PASSWORD). The app surfaces whichever backend you configure as a single “iMessage” channel to end users. Configuring both at once is not supported.
    • Telegram: TELEGRAM_BOT_TOKEN and TELEGRAM_ALLOWED_CHAT_ID (see Telegram Setup)
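As a sketch, a minimal .env for the Telegram path might look like the following. The key names are the ones listed above; the values are placeholders, not working credentials:

```sh
# Minimal .env sketch (Telegram channel + OpenAI) — placeholder values only
OPENAI_API_KEY=sk-your-key-here
# VISION_MODEL falls back to LLM_MODEL when unset
TELEGRAM_BOT_TOKEN=123456789:replace-with-your-bot-token
TELEGRAM_ALLOWED_CHAT_ID=123456789
```

For the iMessage path, you would instead set the Linq or BlueBubbles variables listed above (one backend, not both).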
Start the stack:

docker compose up --build

This will:

  • Start PostgreSQL for data storage
  • Build the app image (Python 3.11, ffmpeg for audio processing)
  • Run database migrations automatically
  • Start the FastAPI server on port 8000
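If port 8000 is already taken on your machine, Docker Compose lets you remap the host port with an override file. This is a hypothetical sketch: the service name `app` is an assumption, so check the repo's docker-compose.yml for the actual service name, and note that Compose merges (appends) `ports` entries from overrides rather than replacing them:

```yaml
# docker-compose.override.yml — hypothetical; assumes the FastAPI service is named "app"
services:
  app:
    ports:
      - "8080:8000"   # expose host port 8080 → container port 8000
```

Compose picks up docker-compose.override.yml automatically on the next `docker compose up`.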
Check that the API is up:

curl http://localhost:8000/api/health
# {"status":"ok"}

Docker Compose starts a Cloudflare Tunnel alongside the app and registers webhooks automatically. No Cloudflare account or auth token required.

Send a message to your assistant and Clawbolt will respond. If you configured an iMessage backend, send an iMessage (or SMS/RCS if Linq is your backend) to the configured phone number or address. If you configured Telegram, message your bot on Telegram.