diff --git a/USAGE.md b/USAGE.md
index e2b2eaa..ae65605 100644
--- a/USAGE.md
+++ b/USAGE.md
@@ -153,6 +153,70 @@
 cd rust
 ./target/debug/claw --model "openai/gpt-4.1-mini" prompt "summarize this repository in one sentence"
 ```
+## Supported Providers & Models
+
+`claw` has three built-in provider backends. The provider is selected automatically based on the model name, falling back to whichever credential is present in the environment.
+
+### Provider matrix
+
+| Provider | Protocol | Auth env var(s) | Base URL env var | Default base URL |
+|---|---|---|---|---|
+| **Anthropic** (direct) | Anthropic Messages API | `ANTHROPIC_API_KEY` or `ANTHROPIC_AUTH_TOKEN` or OAuth (`claw login`) | `ANTHROPIC_BASE_URL` | `https://api.anthropic.com` |
+| **xAI** | OpenAI-compatible | `XAI_API_KEY` | `XAI_BASE_URL` | `https://api.x.ai/v1` |
+| **OpenAI-compatible** | OpenAI Chat Completions | `OPENAI_API_KEY` | `OPENAI_BASE_URL` | `https://api.openai.com/v1` |
+
+The OpenAI-compatible backend also serves as the gateway for **OpenRouter**, **Ollama**, and any other service that speaks the OpenAI `/v1/chat/completions` wire format — just point `OPENAI_BASE_URL` at the service.
+
+### Tested models and aliases
+
+These are the models registered in the built-in alias table with known token limits:
+
+| Alias | Resolved model name | Provider | Max output tokens | Context window |
+|---|---|---|---|---|
+| `opus` | `claude-opus-4-6` | Anthropic | 32 000 | 200 000 |
+| `sonnet` | `claude-sonnet-4-6` | Anthropic | 64 000 | 200 000 |
+| `haiku` | `claude-haiku-4-5-20251213` | Anthropic | 64 000 | 200 000 |
+| `grok` / `grok-3` | `grok-3` | xAI | 64 000 | 131 072 |
+| `grok-mini` / `grok-3-mini` | `grok-3-mini` | xAI | 64 000 | 131 072 |
+| `grok-2` | `grok-2` | xAI | — | — |
+
+Any model name that does not match an alias is passed through verbatim.
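As a minimal sketch of the pass-through path, the two environment variables from the provider matrix are all that is needed to route through OpenRouter's OpenAI-compatible endpoint (the key value here is a placeholder):

```shell
# Point the OpenAI-compatible backend at OpenRouter (placeholder key).
export OPENAI_API_KEY="sk-or-placeholder"
export OPENAI_BASE_URL="https://openrouter.ai/api/v1"

# The slug matches no built-in alias, so it is passed through verbatim:
# claw --model "openai/gpt-4.1-mini" prompt "summarize this repository in one sentence"
```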
+This lets you use OpenRouter model slugs (`openai/gpt-4.1-mini`), Ollama tags (`llama3.2`), or full Anthropic model IDs (`claude-sonnet-4-20250514`) directly.
+
+### User-defined aliases
+
+You can add custom aliases in any settings file (`~/.claw/settings.json`, `.claw/settings.json`, or `.claw/settings.local.json`):
+
+```json
+{
+  "aliases": {
+    "fast": "claude-haiku-4-5-20251213",
+    "smart": "claude-opus-4-6",
+    "cheap": "grok-3-mini"
+  }
+}
+```
+
+Local project settings override user-level settings. Aliases resolve through the built-in table, so `"fast": "haiku"` also works.
+
+### How provider detection works
+
+1. If the resolved model name starts with `claude` → Anthropic.
+2. If it starts with `grok` → xAI.
+3. Otherwise, `claw` checks which credential is set: `ANTHROPIC_API_KEY`/`ANTHROPIC_AUTH_TOKEN` first, then `OPENAI_API_KEY`, then `XAI_API_KEY`.
+4. If nothing matches, it defaults to Anthropic.
+
+## FAQ
+
+### What about Codex?
+
+The name "codex" appears in the Claw Code ecosystem, but it does **not** refer to OpenAI Codex (the code-generation model). Here is what it means in this project:
+
+- **`oh-my-codex` (OmX)** is the workflow and plugin layer that sits on top of `claw`. It provides planning modes, parallel multi-agent execution, notification routing, and other automation features. See [PHILOSOPHY.md](./PHILOSOPHY.md) and the [oh-my-codex repo](https://github.com/Yeachan-Heo/oh-my-codex).
+- **`.codex/` directories** (e.g. `.codex/skills`, `.codex/agents`, `.codex/commands`) are legacy lookup paths that `claw` still scans alongside the primary `.claw/` directories.
+- **`CODEX_HOME`** is an optional environment variable that points to a custom root for user-level skill and command lookups.
+
+`claw` does **not** support OpenAI Codex sessions, the Codex CLI, or Codex session import/export.
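For OpenAI models themselves, the standard credential against the default base URL is enough; a minimal sketch, with a placeholder key:

```shell
# Use OpenAI's own endpoint (placeholder key).
export OPENAI_API_KEY="sk-placeholder"
unset OPENAI_BASE_URL    # fall back to the default, https://api.openai.com/v1
# claw --model "gpt-4.1-mini" prompt "hello"   # no alias match, passed through
```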
+If you need to use OpenAI models (such as GPT-4.1), configure the OpenAI-compatible provider as shown above in the [OpenAI-compatible endpoint](#openai-compatible-endpoint) and [OpenRouter](#openrouter) sections.
+
 ## HTTP proxy support
 
 `claw` honours the standard `HTTP_PROXY`, `HTTPS_PROXY`, and `NO_PROXY` environment variables (both upper- and lower-case spellings are accepted) when issuing outbound requests to Anthropic-, OpenAI-, and xAI-compatible endpoints. Set them before launching the CLI, and the underlying `reqwest` client will be configured automatically.
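A sketch of a typical proxy setup (the proxy address is a placeholder):

```shell
# Send claw's outbound API traffic through a local proxy (placeholder address).
export HTTPS_PROXY="http://127.0.0.1:8080"
export NO_PROXY="localhost,127.0.0.1"   # bypass the proxy for local endpoints, e.g. Ollama
# claw prompt "hello"                   # reqwest reads these variables automatically
```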