# Providers
Five built-in providers, plus any OpenAI-compatible API.
## How providers work
Every provider implements the `Provider` trait, so you can swap providers with a config change and zero code changes.
```rust
#[async_trait]
pub trait Provider: Send + Sync {
    async fn chat(&self, messages: &[Message], tools: &[Tool]) -> Result<Response>;
    fn name(&self) -> &str;
}
```
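To make the swap-by-config idea concrete, here is a simplified, synchronous sketch. The real trait is async and uses baihu's own `Message`, `Tool`, and `Response` types; the stub types and the `provider_from_config` factory below are illustrative assumptions, not baihu's actual API.

```rust
// Simplified stand-ins for baihu's real types, so the sketch is self-contained.
type Message = String;
type Response = String;

trait Provider {
    fn chat(&self, messages: &[Message]) -> Result<Response, String>;
    fn name(&self) -> &'static str;
}

struct Anthropic;
struct Ollama;

impl Provider for Anthropic {
    fn chat(&self, _messages: &[Message]) -> Result<Response, String> {
        // A real implementation would call the Anthropic HTTP API here.
        Ok("response from anthropic".to_string())
    }
    fn name(&self) -> &'static str { "anthropic" }
}

impl Provider for Ollama {
    fn chat(&self, _messages: &[Message]) -> Result<Response, String> {
        Ok("response from ollama".to_string())
    }
    fn name(&self) -> &'static str { "ollama" }
}

// Hypothetical factory: the config name picks the implementation,
// and the rest of the program only ever sees a `Box<dyn Provider>`.
fn provider_from_config(name: &str) -> Option<Box<dyn Provider>> {
    match name {
        "anthropic" => Some(Box::new(Anthropic)),
        "ollama" => Some(Box::new(Ollama)),
        _ => None,
    }
}

fn main() {
    let provider = provider_from_config("ollama").expect("unknown provider");
    println!("{}", provider.name()); // prints "ollama"
}
```

Because callers hold a trait object rather than a concrete type, changing `default_provider` in the config is the only change needed to switch backends.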
## Built-in providers
| Provider | Config name | Models |
|---|---|---|
| OpenRouter | `openrouter` | 200+ models: Claude, GPT-4o, Llama 4, Gemini 2.5, and more |
| Anthropic | `anthropic` | claude-sonnet-4-5, claude-opus-4, claude-haiku-3-5 |
| OpenAI | `openai` | gpt-4o, o3, o4-mini |
| Ollama | `ollama` | any local model |
| Custom | `custom:https://...` | any OpenAI-compatible API |
## Custom provider examples
Any service with an OpenAI-compatible API works out of the box via the `custom` provider: Groq, Mistral, xAI, DeepSeek, Together, Fireworks, Perplexity, Cohere, and more. See custom providers for setup examples.
## Reliability
All providers use:

- Exponential backoff with ±25% random jitter to prevent thundering-herd retries
- A DashMap response cache with a 60-second TTL, so identical prompts don't burn API credits
- SSRF validation on provider URLs (private IP ranges are blocked)
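The backoff schedule above can be sketched as follows. The function name, base delay, and exponent cap are assumptions for illustration, not baihu's actual retry parameters; a real implementation would draw the jitter fraction from an RNG (e.g. the `rand` crate) on each attempt.

```rust
use std::time::Duration;

// Exponential backoff with jitter: delay = base * 2^attempt * (1 + jitter),
// where jitter is a random fraction in [-0.25, 0.25].
fn backoff_delay(attempt: u32, base_ms: u64, jitter: f64) -> Duration {
    // Clamp so the jitter stays within the documented ±25% band.
    let jitter = jitter.clamp(-0.25, 0.25);
    // Cap the exponent (assumed here at 10) so the delay can't grow unbounded.
    let exp_ms = base_ms.saturating_mul(1u64 << attempt.min(10)) as f64;
    Duration::from_millis((exp_ms * (1.0 + jitter)) as u64)
}

fn main() {
    // Attempts 0..4 with a 100 ms base and zero jitter: 100, 200, 400, 800 ms.
    for attempt in 0..4 {
        println!("{:?}", backoff_delay(attempt, 100, 0.0));
    }
    // With +25% jitter the first delay becomes 125 ms.
    println!("{:?}", backoff_delay(0, 100, 0.25));
}
```

Randomizing each client's delay spreads retries out in time, which is what prevents a thundering herd of simultaneous requests after a provider outage.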
## Switching providers
```bash
# via CLI flag
baihu agent -p anthropic -m "hello"
```

```toml
# via config
default_provider = "openrouter"
```

```bash
# via env var
BAIHU_PROVIDER=ollama baihu agent
```