Problem:
- `rust/README.md` on `dev/rust` only shows `ANTHROPIC_API_KEY` under
Configuration and says nothing about other providers, so users
asking "does AWS Bedrock work? other API keys?" get no answer and
may assume silent support.
- `ApiError::MissingApiKey` Display message says "Anthropic API" but
gives no signal to a user who exported `OPENAI_API_KEY` (or
`AWS_ACCESS_KEY_ID`) that their key is being ignored because this
branch has no code path for that provider.
- The multi-provider routing work (providers/anthropic.rs,
providers/openai_compat.rs, prefix routing for openai/, qwen/, etc.)
landed on `main` but has not yet merged into `dev/rust`, so the
support matrix actually differs between branches. Nothing in dev
docs communicates that.
Changes:
1. `rust/README.md`: new "Providers & Auth Support Matrix" section
between Configuration and Features, split into three sub-sections:
a. Supported on `dev/rust` (this branch) — just Anthropic. Explicit
call-out that OPENAI_API_KEY / XAI_API_KEY / DASHSCOPE_API_KEY
are ignored here because the `providers/` module does not exist
on this branch.
b. Additionally supported on `main` — xAI, OpenAI, DashScope, with
a note that model-name prefix routing (`openai/`, `gpt-`,
`qwen/`, `qwen-`) wins over env-var presence.
c. Not supported anywhere in this repo (yet) — AWS Bedrock, Google
Vertex AI, Azure OpenAI, Google Gemini, each with a one-line
"why it doesn't work today" pointing at the concrete code gap
(no SigV4 signer, no Google ADC path, api-version query params,
etc.) so users don't chase phantom config knobs. Proxy-based
escape hatch documented.
2. `rust/crates/api/src/error.rs`: `MissingApiKey` Display message now
keeps the grep-stable prefix ("ANTHROPIC_AUTH_TOKEN or
ANTHROPIC_API_KEY is not set") AND tells the user exactly which
other env vars are ignored on this branch plus which providers
aren't supported on any branch yet, with a pointer at the README
matrix section. Non-breaking change — the variant is still a unit
struct, no callers need to change.
3. New regression test
`missing_api_key_display_lists_supported_and_unsupported_providers_and_points_at_readme`
in `rust/crates/api/src/error.rs` asserts the grep-stable prefix is
preserved AND that OPENAI_API_KEY, XAI_API_KEY, DASHSCOPE_API_KEY,
Bedrock, Vertex, Azure, and rust/README.md all appear in the
rendered message, so future tweaks cannot silently drop the
user-facing signal without breaking CI.
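For illustration, changes 2 and 3 might look roughly like the following sketch. The exact message wording and type layout here are assumptions, not the PR's actual diff; only the grep-stable prefix, the test name, and the asserted substrings come from the description above.

```rust
use std::fmt;

// Illustrative stand-in for the variant in rust/crates/api/src/error.rs.
#[derive(Debug)]
pub enum ApiError {
    MissingApiKey,
}

impl fmt::Display for ApiError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            // Grep-stable prefix first, branch guidance after.
            ApiError::MissingApiKey => write!(
                f,
                "ANTHROPIC_AUTH_TOKEN or ANTHROPIC_API_KEY is not set. \
                 OPENAI_API_KEY, XAI_API_KEY, and DASHSCOPE_API_KEY are ignored \
                 on this branch; AWS Bedrock, Google Vertex AI, and Azure OpenAI \
                 are not supported on any branch. See 'Providers & Auth Support \
                 Matrix' in rust/README.md."
            ),
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn missing_api_key_display_lists_supported_and_unsupported_providers_and_points_at_readme() {
        let msg = ApiError::MissingApiKey.to_string();
        // The grep-stable prefix must survive future rewording.
        assert!(msg.starts_with("ANTHROPIC_AUTH_TOKEN or ANTHROPIC_API_KEY is not set"));
        for needle in ["OPENAI_API_KEY", "XAI_API_KEY", "DASHSCOPE_API_KEY",
                       "Bedrock", "Vertex", "Azure", "rust/README.md"] {
            assert!(msg.contains(needle), "message dropped {needle}");
        }
    }
}
```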
Verification:
- `cargo build --release -p api` clean
- `cargo test --release -p api` 26 unit + 6 integration = 32 passing
- New regression test passes
- `cargo fmt -p api` clean
- `cargo clippy --release -p api` clean
Note: workspace-wide `cargo test` shows 11 pre-existing
`rusty-claude-cli` failures on clean `dev/rust` HEAD caused by tests
reading `~/.claude/plugins/installed/sample-hooks-bundled/` from the
host home directory instead of an isolated test fixture. These are
environment-leak test brittleness, not caused by this PR (verified by
stashing changes and re-running — failures reproduce on unmodified
HEAD). Filing as a separate ROADMAP item.
Does not close any open issue (issues are disabled on the repo);
addresses the Clawhip dogfood nudge from 2026-04-08 about users asking
"other API keys? does AWS Bedrock work too?" without a clear matrix.
Co-authored-by: gaebal-gajae <gaebal-gajae@layofflabs.com>
# 🦞 Claw Code — Rust Implementation

A high-performance Rust rewrite of the Claw Code CLI agent harness. Built for speed, safety, and native tool execution.
## Quick Start

```bash
# Build
cd rust/
cargo build --release

# Run interactive REPL
./target/release/claw

# One-shot prompt
./target/release/claw prompt "explain this codebase"

# With specific model
./target/release/claw --model sonnet prompt "fix the bug in main.rs"
```
## Configuration

Set your API credentials:

```bash
export ANTHROPIC_API_KEY="sk-ant-..."

# Or use a proxy
export ANTHROPIC_BASE_URL="https://your-proxy.com"
```

Or authenticate via OAuth:

```bash
claw login
```
## Providers & Auth Support Matrix

Before anything else: know which branch you're building. Provider
support differs between `dev/rust` and `main`, and neither branch
currently supports AWS Bedrock, Google Vertex AI, or Azure OpenAI.
### Supported on `dev/rust` (this branch)

| Provider | Protocol | Auth env var(s) | Base URL env var | Default base URL |
|---|---|---|---|---|
| Anthropic (direct) | Anthropic Messages API | `ANTHROPIC_API_KEY` or `ANTHROPIC_AUTH_TOKEN` or OAuth (`claw login`) | `ANTHROPIC_BASE_URL` | `https://api.anthropic.com` |
That's it. On `dev/rust`, the `api` crate has a single provider
backend (`rust/crates/api/src/client.rs`) wired directly to
Anthropic's Messages API. There is no `providers/` module, no
auto-routing by model prefix, and no OpenAI-compatible adapter. If you
export `OPENAI_API_KEY`, `XAI_API_KEY`, or `DASHSCOPE_API_KEY` on this
branch, `claw` will ignore them and still fail with `MissingApiKey`
because it only looks at `ANTHROPIC_*`.
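To make that concrete, the branch's credential lookup behaves like this minimal sketch. The function shape is hypothetical (the real logic lives in `rust/crates/api/src/client.rs`); `lookup` stands in for `std::env::var` so the behavior is easy to test:

```rust
// Hypothetical sketch: dev/rust consults only the ANTHROPIC_* variables.
// OPENAI_API_KEY, XAI_API_KEY, and DASHSCOPE_API_KEY are never consulted,
// so exporting them cannot prevent a MissingApiKey failure here.
fn resolve_anthropic_credential(lookup: impl Fn(&str) -> Option<String>) -> Option<String> {
    lookup("ANTHROPIC_AUTH_TOKEN").or_else(|| lookup("ANTHROPIC_API_KEY"))
}
```

With only `OPENAI_API_KEY` in the environment, this lookup returns `None` and the client reports `MissingApiKey`, which is exactly the behavior the section above describes.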
### Additionally supported on `main`

`main` has a multi-provider routing layer under
`rust/crates/api/src/providers/` that `dev/rust` does not yet carry.
If you need any of these, build from `main` and wait for the routing
work to land on `dev/rust`:

| Provider | Protocol | Auth env var | Default base URL |
|---|---|---|---|
| xAI (Grok) | OpenAI-compatible | `XAI_API_KEY` | `https://api.x.ai/v1` |
| OpenAI | OpenAI Chat Completions | `OPENAI_API_KEY` | `https://api.openai.com/v1` |
| DashScope (Alibaba Qwen) | OpenAI-compatible | `DASHSCOPE_API_KEY` | `https://dashscope.aliyuncs.com/compatible-mode/v1` |
Any service that speaks the OpenAI `/v1/chat/completions` wire format
also works by pointing `OPENAI_BASE_URL` at it (OpenRouter, Ollama,
local LLM proxies, etc.).
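For example (illustrative values only; the Ollama URL and model name are assumptions, not project defaults, and this works on `main` only):

```shell
# Point the OpenAI-compatible adapter at a local Ollama server.
export OPENAI_BASE_URL="http://localhost:11434/v1"
# Many local servers accept any non-empty key.
export OPENAI_API_KEY="unused-but-required"
# Then (main branch only):
# ./target/release/claw --model openai/llama3 prompt "hello"
```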
On `main`, the provider is selected automatically by model-name prefix
(`claude` → Anthropic, `grok` → xAI, `openai/` or `gpt-` → OpenAI,
`qwen/` or `qwen-` → DashScope) before falling through to whichever
credential is present. Prefix routing wins over env-var presence, so
setting `ANTHROPIC_API_KEY` will not silently hijack an
`openai/gpt-4.1-mini` request.
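The routing rule above can be sketched as follows. The function and enum names are hypothetical; the real table lives in `rust/crates/api/src/providers/` on `main`:

```rust
#[derive(Debug, PartialEq)]
enum Provider {
    Anthropic,
    Xai,
    OpenAi,
    DashScope,
}

// Hypothetical sketch of main's prefix routing. A prefix hit decides the
// provider before any env var is inspected, which is why a stray
// ANTHROPIC_API_KEY cannot hijack an openai/-prefixed request.
fn route_by_model_prefix(model: &str) -> Option<Provider> {
    if model.starts_with("claude") {
        Some(Provider::Anthropic)
    } else if model.starts_with("grok") {
        Some(Provider::Xai)
    } else if model.starts_with("openai/") || model.starts_with("gpt-") {
        Some(Provider::OpenAi)
    } else if model.starts_with("qwen/") || model.starts_with("qwen-") {
        Some(Provider::DashScope)
    } else {
        None // fall through to whichever credential is present
    }
}
```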
### Not supported anywhere in this repo (yet)

These are the provider backends people reasonably expect to work but
which do not have any code path in either `dev/rust` or `main` as
of this commit. Setting the corresponding cloud SDK env vars will not
make them work — there is nothing to wire them into.
| Provider | Why it doesn't work today |
|---|---|
| AWS Bedrock | No SigV4 signer, no Bedrock-specific request adapter, no AWS_* credential path in the api crate. Bedrock's Claude endpoint is wire-compatible with a different request envelope than direct Anthropic and would need a dedicated backend. |
| Google Vertex AI (Anthropic on Vertex) | No Google auth (service account / ADC) path, no Vertex-specific base URL adapter. Vertex publishes Claude models under a projects/<proj>/locations/<loc>/publishers/anthropic/models/<model>:streamRawPredict URL shape that requires a separate route. |
| Azure OpenAI | OpenAI wire format but uses api-version query params, api-key header (not Authorization: Bearer), and deployment-name routing instead of model IDs. The main-branch OpenAI-compatible adapter assumes upstream OpenAI semantics and won't round-trip Azure deployments cleanly. |
| Google AI Studio (Gemini) | Different request shape entirely; not OpenAI-compatible and not Anthropic-compatible. Would need its own backend. |
If you need one of these, the honest answer today is: use a proxy that
speaks Anthropic or OpenAI on its public side and translates to
Bedrock/Vertex/Azure/Gemini internally. Pointing `ANTHROPIC_BASE_URL`
or (on `main`) `OPENAI_BASE_URL` at a translation proxy is the
supported escape hatch until first-class backends land.
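A sketch of that escape hatch (the proxy host and key are hypothetical; the proxy must speak the Anthropic Messages API on its public side and call Bedrock, Vertex, Azure, or Gemini internally):

```shell
# Front the Anthropic client with a translation proxy.
export ANTHROPIC_BASE_URL="https://claude-proxy.example.internal"
export ANTHROPIC_API_KEY="key-issued-by-your-proxy"
```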
### When auth fails

If you see `ANTHROPIC_AUTH_TOKEN or ANTHROPIC_API_KEY is not set` on
`dev/rust` after setting, say, `OPENAI_API_KEY`, that's not a bug —
it's this branch telling you honestly that it doesn't yet know how to
talk to OpenAI. Either build from `main`, or export
`ANTHROPIC_API_KEY`, or run `claw login` to use Anthropic OAuth.
## Features
| Feature | Status |
|---|---|
| Anthropic API + streaming | ✅ |
| OAuth login/logout | ✅ |
| Interactive REPL (rustyline) | ✅ |
| Tool system (bash, read, write, edit, grep, glob) | ✅ |
| Web tools (search, fetch) | ✅ |
| Sub-agent orchestration | ✅ |
| Todo tracking | ✅ |
| Notebook editing | ✅ |
| CLAUDE.md / project memory | ✅ |
| Config file hierarchy (.claude.json) | ✅ |
| Permission system | ✅ |
| MCP server lifecycle | ✅ |
| Session persistence + resume | ✅ |
| Extended thinking (thinking blocks) | ✅ |
| Cost tracking + usage display | ✅ |
| Git integration | ✅ |
| Markdown terminal rendering (ANSI) | ✅ |
| Model aliases (opus/sonnet/haiku) | ✅ |
| Slash commands (/status, /compact, /clear, etc.) | ✅ |
| Hooks (PreToolUse/PostToolUse) | 🔧 Config only |
| Plugin system | 📋 Planned |
| Skills registry | 📋 Planned |
## Model Aliases

Short names resolve to the latest model versions:

| Alias | Resolves To |
|---|---|
| `opus` | `claude-opus-4-6` |
| `sonnet` | `claude-sonnet-4-6` |
| `haiku` | `claude-haiku-4-5-20251213` |
## CLI Flags

```
claw [OPTIONS] [COMMAND]

Options:
  --model MODEL                    Set the model (alias or full name)
  --dangerously-skip-permissions   Skip all permission checks
  --permission-mode MODE           Set read-only, workspace-write, or danger-full-access
  --allowedTools TOOLS             Restrict enabled tools
  --output-format FORMAT           Output format (text or json)
  --version, -V                    Print version info

Commands:
  prompt <text>   One-shot prompt (non-interactive)
  login           Authenticate via OAuth
  logout          Clear stored credentials
  init            Initialize project config
  doctor          Check environment health
  self-update     Update to latest version
```
## Slash Commands (REPL)

| Command | Description |
|---|---|
| `/help` | Show help |
| `/status` | Show session status (model, tokens, cost) |
| `/cost` | Show cost breakdown |
| `/compact` | Compact conversation history |
| `/clear` | Clear conversation |
| `/model [name]` | Show or switch model |
| `/permissions` | Show or switch permission mode |
| `/config [section]` | Show config (env, hooks, model) |
| `/memory` | Show CLAUDE.md contents |
| `/diff` | Show git diff |
| `/export [path]` | Export conversation |
| `/session [id]` | Resume a previous session |
| `/version` | Show version |
## Workspace Layout

```
rust/
├── Cargo.toml            # Workspace root
├── Cargo.lock
└── crates/
    ├── api/              # Anthropic API client + SSE streaming
    ├── commands/         # Shared slash-command registry
    ├── compat-harness/   # TS manifest extraction harness
    ├── runtime/          # Session, config, permissions, MCP, prompts
    ├── rusty-claude-cli/ # Main CLI binary (`claw`)
    └── tools/            # Built-in tool implementations
```
## Crate Responsibilities

- `api` — HTTP client, SSE stream parser, request/response types, auth (API key + OAuth bearer)
- `commands` — Slash command definitions and help text generation
- `compat-harness` — Extracts tool/prompt manifests from upstream TS source
- `runtime` — `ConversationRuntime` agentic loop, `ConfigLoader` hierarchy, `Session` persistence, permission policy, MCP client, system prompt assembly, usage tracking
- `rusty-claude-cli` — REPL, one-shot prompt, streaming display, tool call rendering, CLI argument parsing
- `tools` — Tool specs + execution: Bash, ReadFile, WriteFile, EditFile, GlobSearch, GrepSearch, WebSearch, WebFetch, Agent, TodoWrite, NotebookEdit, Skill, ToolSearch, REPL runtimes
## Stats

- ~20K lines of Rust
- 6 crates in workspace
- Binary name: `claw`
- Default model: `claude-opus-4-6`
- Default permissions: `danger-full-access`
## License
See repository root.