# Providers

## ollama

local models with ollama.
### setup
- install ollama (https://ollama.com)
- pull a model:

  ```sh
  ollama pull llama3.2
  ```

- configure baihu:
  ```toml
  # ~/.baihu/config.toml
  default_provider = "ollama"
  default_model = "llama3.2"
  ```

no API key needed. ollama runs on http://localhost:11434 by default.
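
to confirm the setup before running baihu, you can check that the model was pulled and that the server is listening (a quick sanity check, assuming a default local install):

```sh
# list locally pulled models; llama3.2 should appear after the pull above
ollama list

# ask the server for its version to confirm it's listening on the default port
curl http://localhost:11434/api/version
```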
### notes
- ollama is intentionally exempt from SSRF validation (it's local by design)
- supports any model available in the ollama registry
- tool use support depends on the model; see the check below
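
to see whether a given model advertises tool use, you can inspect its metadata (a sketch: recent ollama releases print a capabilities list, but the exact output format varies by version):

```sh
# show model metadata; look for "tools" under the capabilities section
ollama show llama3.2
```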
### custom ollama host
point baihu at a remote ollama instance with the `OLLAMA_HOST` environment variable:

```sh
OLLAMA_HOST=http://192.168.1.100:11434 baihu agent
```
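
since models live on whichever host serves them, it's worth confirming the remote instance responds and has the model you configured (address reused from the example above; `/api/tags` lists the models installed on that host):

```sh
# list models available on the remote ollama server
curl http://192.168.1.100:11434/api/tags
```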