Providers

Ante is provider-agnostic. Each provider implements a common interface for sending prompts and receiving streaming responses. Providers are resolved from a catalog at session init time.

Built-in providers

| Provider | Wire Format | Models |
| --- | --- | --- |
| Anthropic | Messages API | Claude family (haiku-4-5, sonnet-4-5, sonnet-4-6, opus-4-6) |
| Anthropic Subscription | Messages API | Claude family (OAuth) |
| OpenAI | Responses API | GPT-5 family |
| OpenAI Compatible | Chat Completions | Custom models |
| OpenAI Subscription | Responses API | GPT-5 family (OAuth via Codex) |
| Gemini | Gemini API | Gemini 3 family |
| Vertex AI Gemini | Gemini API | Gemini 3 family |
| Grok (xAI) | Responses API | Grok 4 |
| Zai | OpenAI-compatible | GLM-4.7 |
| Open Router | OpenAI-compatible | Multiple providers |
| Antix | OpenAI-compatible | Claude, GPT via Antigma (OAuth) |
| Local | OpenAI-compatible | GGUF models |

Provider identifiers

| ID | Provider |
| --- | --- |
| anthropic | Anthropic (Claude) |
| anthropic-subscription | Anthropic Subscription (Pro/Max) |
| openai | OpenAI (GPT) |
| openai-compatible | OpenAI Compatible |
| openai-subscription | OpenAI Subscription |
| gemini | Google Gemini |
| vertex-gemini | Vertex AI Gemini |
| openrouter | Open Router |
| xai | Grok (xAI) |
| zai | Zai |
| antix | Antix |
| local | Local models via llama.cpp |

Selecting a provider

You can set your provider in three ways (in order of precedence):

  1. CLI flag — ante --provider anthropic --model claude-sonnet-4-5
  2. Settings file — Set provider and model in ~/.ante/settings.json
  3. Built-in default — Anthropic with Claude Sonnet
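For the settings-file route, a minimal ~/.ante/settings.json could look like the sketch below. The top-level provider and model key names are an assumption based on the CLI flags above; check your installed version if they differ:

```json
{
  "provider": "anthropic",
  "model": "claude-sonnet-4-5"
}
```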

Authentication

| Provider | Auth Method |
| --- | --- |
| Anthropic | ANTHROPIC_API_KEY env var or OAuth |
| Anthropic Subscription | OAuth |
| OpenAI | OPENAI_API_KEY env var or OAuth |
| OpenAI Compatible | OPENAI_COMPATIBLE_API_KEY env var |
| OpenAI Subscription | OAuth |
| Gemini | GEMINI_API_KEY env var |
| Vertex AI Gemini | VERTEX_GEMINI_API_KEY env var |
| Grok | XAI_API_KEY env var |
| Zai | Z_AI_API_KEY env var |
| Open Router | OPENROUTER_API_KEY env var |
| Antix | OAuth |
| Local | No authentication needed |
tip

Anthropic, OpenAI, and Antix also support interactive OAuth flows through the TUI.
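For API-key authentication, exporting the variables from the table before launching is all that's needed. A minimal sketch (the key values below are placeholders, not real keys):

```shell
# Variable names are taken from the Authentication table above.
# Substitute your real keys for the placeholder values.
export ANTHROPIC_API_KEY="sk-ant-example"
export OPENROUTER_API_KEY="sk-or-example"
```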

Model details

Anthropic (Claude)

ante --provider anthropic --model claude-sonnet-4-5

Available models: claude-haiku-4-5, claude-sonnet-4-5, claude-sonnet-4-6, claude-opus-4-6

OpenAI

ante --provider openai --model gpt-5.4

Available models: gpt-5.4, gpt-5.4-pro, gpt-5.3-codex, gpt-5-mini, gpt-5-nano

Google Gemini

ante --provider gemini --model gemini-3-flash-preview

Vertex AI Gemini

ante --provider vertex-gemini --model gemini-3-flash-preview

Grok (xAI)

ante --provider xai --model grok-4-latest

Zai

ante --provider zai --model glm-4.7

Open Router

ante --provider openrouter --model deepseek/deepseek-reasoner

Antix

ante --provider antix --model claude-sonnet-4-5

Local models

Run GGUF models locally via the built-in llama.cpp engine. No API keys or internet required. See Offline Mode for setup details.

ante --provider local

Standalone models

| Model | Description | Context |
| --- | --- | --- |
| qwen3.5-plus | Most powerful Qwen visual model | 1M |
| qwen-plus | Fast and capable | 1M |
| kimi-k2.5 | Kimi K2.5 reasoning model | 262K |
| MiniMax-M2.5 | MiniMax M2.5 model | 200K |

Custom catalog

Create ~/.ante/catalog.json to add or override providers and models:

{
  "providers": {
    "my-provider": {
      "name": "my-provider",
      "display_name": "My Provider",
      "base_url": "https://api.example.com/v1",
      "auth": { "bearer": { "env_key": "MY_PROVIDER_API_KEY" } },
      "wire_style": "OpenAiCompatible"
    }
  },
  "models": {
    "my-model": {
      "name": "my-model",
      "description": "Custom model",
      "max_tokens": 32000,
      "context_limit": 200000,
      "thinking": "Enabled"
    }
  }
}

Supported wire styles: AnthropicMessage, OpenAiCompatible, OpenAiResponse, Gemini.

Supported thinking modes: Disabled, Enabled, Deep, Max.
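Before pointing Ante at a custom catalog, it can help to confirm the file parses as valid JSON. A sketch using a throwaway temp file (so nothing under ~/.ante is touched); the provider fields mirror the example above:

```shell
# Write a minimal catalog to a temp path and validate it with Python's
# stdlib json.tool; a parse error makes the command exit nonzero.
cat > /tmp/ante-catalog-check.json <<'EOF'
{
  "providers": {
    "my-provider": {
      "name": "my-provider",
      "display_name": "My Provider",
      "base_url": "https://api.example.com/v1",
      "auth": { "bearer": { "env_key": "MY_PROVIDER_API_KEY" } },
      "wire_style": "OpenAiCompatible"
    }
  },
  "models": {}
}
EOF
python3 -m json.tool /tmp/ante-catalog-check.json > /dev/null && echo "catalog JSON is valid"
```

The same check works on ~/.ante/catalog.json directly once you have written it.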

Third-party & custom providers

Using Open Router

  1. Sign up at openrouter.ai and generate an API key.
  2. Set your API key:
    export OPENROUTER_API_KEY="sk-or-..."
  3. Use any model from Open Router's model list:
    ante --provider openrouter --model deepseek/deepseek-reasoner

OpenAI-compatible providers

Many LLM providers expose an OpenAI-compatible API (e.g., Together AI, Fireworks, Groq Cloud, Perplexity):

  1. Set the base URL:
    export MODEL_BASE_URL="https://api.together.xyz/v1"
  2. Set your API key:
    export OPENAI_COMPATIBLE_API_KEY="your-provider-api-key"
  3. Run with the OpenAI Compatible provider:
    ante --provider openai-compatible --model meta-llama/Llama-3-70b-chat-hf
tip

When using third-party providers, make sure the model supports tool use (function calling). Ante relies on tool use for its agent capabilities.

warning

Not all models work equally well as coding agents. If you experience issues, try a larger or more capable model.