# Custom providers

Baihu can use any OpenAI-compatible API endpoint as a provider.

## Usage

Point Baihu at any OpenAI-compatible endpoint:

```toml
# ~/.baihu/config.toml
default_provider = "custom:https://your-api.com/v1"
default_model = "your-model"
api_key = "your-key"
```

## Examples

### Google Gemini (OpenAI-compatible)

```toml
default_provider = "custom:https://generativelanguage.googleapis.com/v1beta/openai"
default_model = "gemini-2.5-flash"
api_key = "your-google-key"
```

### Together.ai

```toml
default_provider = "custom:https://api.together.xyz/v1"
default_model = "meta-llama/Llama-3-70b-chat-hf"
api_key = "your-together-key"
```

### Local vLLM

```toml
default_provider = "custom:http://localhost:8000/v1"
default_model = "your-model"
```

## Requirements

The endpoint must implement the OpenAI chat completions API format (`/chat/completions`). Tool use (function calling) is supported if the endpoint supports it.
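To make the expected format concrete, here is a minimal sketch in Python of the request shape such an endpoint must accept. The base URL and model name are placeholders carried over from the example config; this only constructs the request rather than sending it.

```python
import json

# Placeholder values from the example config above (not a real endpoint).
base_url = "https://your-api.com/v1"

# Minimal OpenAI-style chat completions payload; a compatible endpoint
# must accept at least "model" and "messages".
payload = {
    "model": "your-model",
    "messages": [{"role": "user", "content": "hello"}],
}

# The chat completions path is appended to the configured base URL.
url = base_url.rstrip("/") + "/chat/completions"
print(url)
print(json.dumps(payload))
```

If a POST of this payload to that URL returns an OpenAI-style response (a `choices` list with `message` objects), the endpoint should work as a custom provider.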

## SSRF protection

Custom URLs are validated against private IP ranges before any request is made. This blocks 127.0.0.0/8 (loopback), 10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16, 169.254.0.0/16 (link-local), and their IPv6 equivalents.

Ollama is the only exception: it is explicitly exempt because it is local by design.
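Baihu's actual validation code is not shown here, but the kind of check described can be sketched in Python with the standard `ipaddress` module. The function name `is_blocked` is hypothetical; the idea is to resolve the URL's host and reject any address in a private, loopback, or link-local range.

```python
import ipaddress
import socket
from urllib.parse import urlparse

def is_blocked(url: str) -> bool:
    """Hypothetical sketch of SSRF validation for a custom provider URL."""
    host = urlparse(url).hostname
    if host is None:
        return True  # malformed URL: fail closed
    try:
        # Resolve the hostname; a public-looking name may map to a
        # private address, so the check runs on the resolved IPs.
        infos = socket.getaddrinfo(host, None)
    except socket.gaierror:
        return True  # unresolvable: fail closed
    for info in infos:
        ip = ipaddress.ip_address(info[4][0])
        # is_private covers 10/8, 172.16/12, 192.168/16; the explicit
        # loopback and link-local checks cover 127/8 and 169.254/16
        # plus their IPv6 equivalents (::1, fe80::/10).
        if ip.is_private or ip.is_loopback or ip.is_link_local:
            return True
    return False
```

Resolving before checking matters: validating only the literal hostname would let a DNS record pointing at `127.0.0.1` slip through.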
