diff --git a/rust/README.md b/rust/README.md
index eff924b..40aa637 100644
--- a/rust/README.md
+++ b/rust/README.md
@@ -35,6 +35,78 @@ Or authenticate via OAuth:
 claw login
 ```
 
+## Providers & Auth Support Matrix
+
+Before anything else: **know which branch you're building.** Provider
+support differs between `dev/rust` and `main`, and neither branch
+currently supports AWS Bedrock, Google Vertex AI, or Azure OpenAI.
+
+### Supported on `dev/rust` (this branch)
+
+| Provider | Protocol | Auth env var(s) | Base URL env var | Default base URL |
+|---|---|---|---|---|
+| **Anthropic** (direct) | Anthropic Messages API | `ANTHROPIC_API_KEY` or `ANTHROPIC_AUTH_TOKEN` or OAuth (`claw login`) | `ANTHROPIC_BASE_URL` | `https://api.anthropic.com` |
+
+That's it. On `dev/rust`, the `api` crate has a single provider
+backend (`rust/crates/api/src/client.rs`) wired directly to
+Anthropic's Messages API. There is no `providers/` module, no
+auto-routing by model prefix, and no OpenAI-compatible adapter. If you
+export `OPENAI_API_KEY`, `XAI_API_KEY`, or `DASHSCOPE_API_KEY` on this
+branch, claw will ignore them and still fail with `MissingApiKey`,
+because it only looks at `ANTHROPIC_*`.
+
+### Additionally supported on `main`
+
+`main` has a multi-provider routing layer under
+`rust/crates/api/src/providers/` that `dev/rust` does not yet carry.
+If you need any of these, build from `main` and wait for the routing
+work to land on `dev/rust`:
+
+| Provider | Protocol | Auth env var | Default base URL |
+|---|---|---|---|
+| **xAI** (Grok) | OpenAI-compatible | `XAI_API_KEY` | `https://api.x.ai/v1` |
+| **OpenAI** | OpenAI Chat Completions | `OPENAI_API_KEY` | `https://api.openai.com/v1` |
+| **DashScope** (Alibaba Qwen) | OpenAI-compatible | `DASHSCOPE_API_KEY` | `https://dashscope.aliyuncs.com/compatible-mode/v1` |
+
+Any service that speaks the OpenAI `/v1/chat/completions` wire format
+also works by pointing `OPENAI_BASE_URL` at it (OpenRouter, Ollama,
+local LLM proxies, etc.).
+
+On `main`, the provider is selected automatically by model-name prefix
+(`claude` → Anthropic, `grok` → xAI, `openai/` or `gpt-` → OpenAI,
+`qwen/` or `qwen-` → DashScope) before falling through to whichever
+credential is present. Prefix routing wins over env-var presence, so
+setting `ANTHROPIC_API_KEY` will not silently hijack an
+`openai/gpt-4.1-mini` request.
+
+### Not supported anywhere in this repo (yet)
+
+These are the provider backends people reasonably expect to work but
+which **do not have any code path** in either `dev/rust` or `main` as
+of this commit. Setting the corresponding cloud SDK env vars will not
+make them work — there is nothing to wire them into.
+
+| Provider | Why it doesn't work today |
+|---|---|
+| **AWS Bedrock** | No SigV4 signer, no Bedrock-specific request adapter, no `AWS_*` credential path in the api crate. Bedrock's Claude endpoint uses a different request envelope than direct Anthropic and would need a dedicated backend. |
+| **Google Vertex AI (Anthropic on Vertex)** | No Google auth (service account / ADC) path, no Vertex-specific base URL adapter. Vertex publishes Claude models under a `projects/<project>/locations/<region>/publishers/anthropic/models/<model>:streamRawPredict` URL shape that requires a separate route. |
+| **Azure OpenAI** | OpenAI wire format, but uses `api-version` query params, an `api-key` header (not `Authorization: Bearer`), and deployment-name routing instead of model IDs. The `main`-branch OpenAI-compatible adapter assumes upstream OpenAI semantics and won't round-trip Azure deployments cleanly. |
+| **Google AI Studio (Gemini)** | Different request shape entirely; not OpenAI-compatible and not Anthropic-compatible. Would need its own backend. |
+
+If you need one of these, the honest answer today is: use a proxy that
+speaks Anthropic or OpenAI on its public side and translates to
+Bedrock/Vertex/Azure/Gemini internally. Pointing `ANTHROPIC_BASE_URL`
+or (on `main`) `OPENAI_BASE_URL` at a translation proxy is the
+supported escape hatch until first-class backends land.
+
+### When auth fails
+
+If you see `ANTHROPIC_AUTH_TOKEN or ANTHROPIC_API_KEY is not set` on
+`dev/rust` after setting, say, `OPENAI_API_KEY`, that's not a bug —
+it's this branch telling you honestly that it doesn't yet know how to
+talk to OpenAI. Either build from `main`, export
+`ANTHROPIC_API_KEY`, or run `claw login` to use Anthropic OAuth.
+
 ## Features
 
 | Feature | Status |
diff --git a/rust/crates/api/src/error.rs b/rust/crates/api/src/error.rs
index 2c31691..3ef30c3 100644
--- a/rust/crates/api/src/error.rs
+++ b/rust/crates/api/src/error.rs
@@ -52,9 +52,23 @@ impl Display for ApiError {
     fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
         match self {
             Self::MissingApiKey => {
+                // Intentionally explicit about what this branch does and
+                // does not support, so users who exported OPENAI_API_KEY,
+                // XAI_API_KEY, DASHSCOPE_API_KEY, AWS_*, or Google
+                // service-account credentials get an immediate "aha"
+                // instead of assuming a misconfiguration bug. See
+                // rust/README.md § "Providers & Auth Support Matrix"
+                // for the full matrix and the branch differences.
                 write!(
                     f,
-                    "ANTHROPIC_AUTH_TOKEN or ANTHROPIC_API_KEY is not set; export one before calling the Anthropic API"
+                    "ANTHROPIC_AUTH_TOKEN or ANTHROPIC_API_KEY is not set; export one before calling the Anthropic API. \
+                     On this branch (`dev/rust`) only Anthropic is wired up \
+                     — OPENAI_API_KEY, XAI_API_KEY, DASHSCOPE_API_KEY, and \
+                     AWS/Google credentials are ignored. Multi-provider \
+                     routing (OpenAI, xAI, DashScope) lives on `main`; AWS \
+                     Bedrock, Google Vertex AI, and Azure OpenAI are not \
+                     supported on any branch yet. See rust/README.md \
+                     § 'Providers & Auth Support Matrix' for details."
                 )
             }
             Self::ExpiredOAuthToken => {
@@ -132,3 +146,46 @@ impl From for ApiError {
         Self::InvalidApiKeyEnv(value)
     }
 }
+
+#[cfg(test)]
+mod tests {
+    use super::*;
+
+    #[test]
+    fn missing_api_key_display_lists_supported_and_unsupported_providers_and_points_at_readme() {
+        // given
+        let error = ApiError::MissingApiKey;
+
+        // when
+        let rendered = format!("{error}");
+
+        // then — the message must keep the grep-stable core so CI
+        // parsers and docs that quote the exact substring continue to
+        // resolve, AND it must tell the user which env vars are
+        // ignored on this branch and where to find the full matrix.
+        assert!(
+            rendered.contains("ANTHROPIC_AUTH_TOKEN or ANTHROPIC_API_KEY is not set"),
+            "grep-stable prefix must remain intact, got: {rendered}"
+        );
+        assert!(
+            rendered.contains("OPENAI_API_KEY"),
+            "should explicitly call out that OPENAI_API_KEY is ignored on dev/rust, got: {rendered}"
+        );
+        assert!(
+            rendered.contains("XAI_API_KEY"),
+            "should explicitly call out that XAI_API_KEY is ignored on dev/rust, got: {rendered}"
+        );
+        assert!(
+            rendered.contains("DASHSCOPE_API_KEY"),
+            "should explicitly call out that DASHSCOPE_API_KEY is ignored on dev/rust, got: {rendered}"
+        );
+        assert!(
+            rendered.contains("Bedrock") && rendered.contains("Vertex") && rendered.contains("Azure"),
+            "should tell users Bedrock/Vertex/Azure are not supported on any branch, got: {rendered}"
+        );
+        assert!(
+            rendered.contains("rust/README.md"),
+            "should point users at the README matrix for the full story, got: {rendered}"
+        );
+    }
+}