## Provider Configuration
Basilisk connects to LLMs through provider adapters powered by LiteLLM.
### Supported Providers
| Provider | Flag | Env Variable |
|----------|------|--------------|
| OpenAI | `-p openai` | `OPENAI_API_KEY` |
| Anthropic | `-p anthropic` | `ANTHROPIC_API_KEY` |
| Google Gemini | `-p google` | `GOOGLE_API_KEY` |
| Azure OpenAI | `-p azure` | `AZURE_API_KEY` |
| AWS Bedrock | `-p bedrock` | AWS credentials |
| Ollama | `-p ollama` | (local, no key) |
| vLLM | `-p vllm` | (local, no key) |
| Custom HTTP | `-p custom` | N/A |
| Custom WebSocket | `-p websocket` | N/A |
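A common failure mode is launching a scan before the provider's key is exported. The mapping in the table above can be checked with a small pre-flight sketch; the `env_var_for_provider` helper below is purely illustrative (Basilisk performs its own validation):

```shell
# Hypothetical pre-flight helper: map a -p provider flag to the environment
# variable listed in the table above.
env_var_for_provider() {
  case "$1" in
    openai)    echo "OPENAI_API_KEY" ;;
    anthropic) echo "ANTHROPIC_API_KEY" ;;
    google)    echo "GOOGLE_API_KEY" ;;
    azure)     echo "AZURE_API_KEY" ;;
    bedrock)   echo "AWS_ACCESS_KEY_ID" ;;  # Bedrock uses standard AWS credentials
    *)         echo "" ;;                   # local/custom providers need no key
  esac
}

# Warn before scanning if the required key is missing from the environment.
provider="openai"
var=$(env_var_for_provider "$provider")
if [ -n "$var" ]; then
  eval "val=\${$var:-}"
  [ -n "$val" ] || echo "warning: $var is not set for provider '$provider'" >&2
fi
```

Running the check before `basilisk scan` turns a mid-scan authentication error into an immediate, readable warning.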
### Examples
```shell
# OpenAI
export OPENAI_API_KEY="sk-..."
basilisk scan -t https://api.target.com/chat -p openai

# Anthropic Claude
export ANTHROPIC_API_KEY="sk-ant-..."
basilisk scan -t https://api.target.com/chat -p anthropic

# Local Ollama
basilisk scan -t http://localhost:11434 -p ollama --model llama3

# Custom REST endpoint
basilisk scan -t https://api.internal.com/predict -p custom \
  --custom-header "Authorization: Bearer token123"
```
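The value passed to `--custom-header` follows the familiar `Name: value` convention (the same format `curl -H` accepts). A quick sketch of how such a string splits into its two parts, using the header from the example above:

```shell
# Split a "Name: value" header string into its name and value.
header='Authorization: Bearer token123'
name=${header%%:*}     # everything before the first colon
value=${header#*: }    # everything after the first ": "
echo "$name"    # Authorization
echo "$value"   # Bearer token123
```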