Baihu
Providers

providers overview

22+ AI providers with a pluggable trait system

how providers work

every provider implements the Provider trait. swapping providers is a config change; no code changes required.

use async_trait::async_trait;

#[async_trait]
pub trait Provider: Send + Sync {
    /// send the conversation plus available tools; returns the model's reply
    async fn chat(&self, messages: &[Message], tools: &[Tool]) -> Result<Response>;
    /// stable identifier used in config (e.g. "openrouter")
    fn name(&self) -> &str;
}
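
to make the pluggable design concrete, here is a minimal sketch of a provider and a config-driven lookup. the types and the synchronous chat are simplified stand-ins (the real trait is async and takes tools); MockProvider and provider_from_name are hypothetical names for illustration only:

```rust
// Hypothetical simplified stand-ins for baihu's real Message/Response types.
#[derive(Clone, Debug)]
pub struct Message { pub role: String, pub content: String }
#[derive(Clone, Debug)]
pub struct Response { pub text: String }

// Simplified synchronous stand-in for the async Provider trait.
pub trait Provider {
    fn chat(&self, messages: &[Message]) -> Result<Response, String>;
    fn name(&self) -> &str;
}

// A mock provider that echoes the last message, useful for tests.
pub struct MockProvider;
impl Provider for MockProvider {
    fn chat(&self, messages: &[Message]) -> Result<Response, String> {
        let last = messages.last().ok_or("no messages")?;
        Ok(Response { text: format!("echo: {}", last.content) })
    }
    fn name(&self) -> &str { "mock" }
}

// Map a config name to a boxed provider, the way `default_provider` selects one.
pub fn provider_from_name(name: &str) -> Option<Box<dyn Provider>> {
    match name {
        "mock" => Some(Box::new(MockProvider)),
        _ => None,
    }
}
```

because callers only hold a `Box<dyn Provider>`, swapping the backend is just returning a different box from the lookup.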

supported providers

provider         config name          models
OpenRouter       openrouter           claude, gpt-4, llama, gemini, etc.
Anthropic        anthropic            claude-3.5-sonnet, claude-3-opus
OpenAI           openai               gpt-4o, gpt-4-turbo
Ollama           ollama               any local model
Groq             groq                 llama, mixtral
Mistral          mistral              mistral-large, mistral-medium
xAI              xai                  grok
DeepSeek         deepseek             deepseek-chat, deepseek-coder
Fireworks        fireworks            llama, mixtral
Perplexity       perplexity           sonar models
Cohere           cohere               command-r
Cloudflare AI    cloudflare           workers AI models
AWS Bedrock      bedrock              claude, titan
Custom           custom:https://...   any OpenAI-compatible API

reliability

all providers use:

  • exponential backoff with ±25% random jitter to prevent thundering herd
  • DashMap response cache with 60s TTL so identical prompts don't burn API credits
  • SSRF validation on provider URLs (blocks private IP ranges)
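
the backoff policy above can be sketched as a pure delay function. this is an illustrative sketch, not baihu's actual implementation; the function name and the caller-supplied jitter value are assumptions:

```rust
use std::time::Duration;

/// Delay for a given retry attempt: base * 2^attempt, scaled by ±25% jitter.
/// `jitter` is a random value in [0.0, 1.0), supplied by the caller's RNG.
pub fn backoff_delay(base_ms: u64, attempt: u32, jitter: f64) -> Duration {
    // Cap the exponent so the shift cannot overflow u64.
    let exp = base_ms.saturating_mul(1u64 << attempt.min(16));
    // Map jitter in [0, 1) onto a multiplier in [0.75, 1.25).
    let factor = 0.75 + 0.5 * jitter;
    Duration::from_millis((exp as f64 * factor) as u64)
}
```

spreading each client's delay over a ±25% window means retries from many clients land at different times instead of all at once, which is what defuses the thundering herd.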

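the SSRF check can likewise be sketched with the standard library's IP classification. this is a simplified illustration (the function name is hypothetical); a real guard would also resolve DNS before checking and cover additional ranges such as IPv6 unique-local addresses:

```rust
use std::net::IpAddr;

/// Reject addresses an attacker could use to reach internal services.
pub fn is_blocked_ip(ip: IpAddr) -> bool {
    match ip {
        IpAddr::V4(v4) => v4.is_private() || v4.is_loopback() || v4.is_link_local(),
        IpAddr::V6(v6) => v6.is_loopback(),
    }
}
```
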
switching providers

# via CLI flag
baihu agent -p anthropic -m "hello"

# via config
default_provider = "openrouter"

# via env var
BAIHU_PROVIDER=ollama baihu agent
