Model Providers
OpenClaw can use many LLM providers. Pick a provider, authenticate, then set the default model as `provider/model`.
Looking for chat channel docs (WhatsApp/Telegram/Discord/Slack/Mattermost (plugin)/etc.)? See Channels.
Quick start
- Authenticate with the provider (usually via `openclaw onboard`).
- Set the default model:
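As a sketch of that second step, the default model is a single `provider/model` string in your OpenClaw config. The exact file location, key names, and model ID below are assumptions for illustration; check the provider page for your setup:

```json
{
  "model": "anthropic/claude-sonnet-4"
}
```

The `provider/` prefix tells OpenClaw which authenticated provider to route the request through, and the part after the slash is that provider's model ID.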
Provider docs
- Amazon Bedrock
- Anthropic (API + Claude Code CLI)
- Cloudflare AI Gateway
- GLM models
- Hugging Face (Inference)
- Kilocode
- LiteLLM (unified gateway)
- MiniMax
- Mistral
- Moonshot AI (Kimi + Kimi Coding)
- NVIDIA
- Ollama (local models)
- OpenAI (API + Codex)
- OpenCode Zen
- OpenRouter
- Qianfan
- Qwen (OAuth)
- Together AI
- Venice (Venice AI, privacy-focused)
- Vercel AI Gateway
- vLLM (local models)
- Xiaomi
- Z.AI
Transcription providers
Community tools
- Claude Max API Proxy - Community proxy for Claude subscription credentials (verify Anthropic policy/terms before use)