OpenRouter
OpenRouter is a routing layer that lets one API key reach hundreds of models from dozens of providers — Anthropic, OpenAI, Google, Mistral, Meta, DeepSeek, plus open-weight models hosted by various partners. It's the best fit when you want to compare models or use one that doesn't have a first-class adapter yet.
What you need
- An OpenRouter account: openrouter.ai
- A funded balance (pay-as-you-go; minimum top-up is $5)
- An API key (starts with `sk-or-…`)
Steps
- Top up. openrouter.ai/credits — add $5–$10 to start.
- Create the key. openrouter.ai/keys → Create Key. Optionally set a per-key credit limit so a runaway session can't drain your balance.
- Add it to Kenaz. Providers → Add provider → OpenRouter. Paste the key, Test, Save.
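Once the key is saved, it can also be sanity-checked outside Kenaz. A minimal sketch in Python, assuming OpenRouter's documented OpenAI-compatible chat-completions endpoint (the key and model slug below are placeholders):

```python
import json

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(api_key: str, model: str, prompt: str):
    """Build the pieces of an OpenRouter chat-completions call.

    OpenRouter exposes an OpenAI-compatible endpoint, so the payload is
    the familiar messages format; only the base URL and the key differ.
    """
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return OPENROUTER_URL, headers, json.dumps(payload)

url, headers, body = build_request("sk-or-...", "anthropic/claude-sonnet-4-6", "ping")
# Send with any HTTP client, e.g.:
#   requests.post(url, headers=headers, data=body)
```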
Models and what they're for
The full list lives at openrouter.ai/models — searchable, with per-model pricing and provider routing info. The model picker in Kenaz mirrors this list (it's pulled from the OpenRouter API on save).
A few patterns:
- `anthropic/claude-sonnet-4-6`, `openai/gpt-5`, `google/gemini-2.5-pro` — direct models from the major labs. Same prices as direct, ± a small OpenRouter margin.
- `:nitro` suffix — fastest available host for a given model.
- `:floor` suffix — cheapest available host.
- `openrouter/auto` — OpenRouter picks for you.
Pricing
OpenRouter's margin is documented on openrouter.ai/docs/quickstart. For most models it's a flat ~5% over the upstream rate. Kenaz's Usage view shows the total spend per session.
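The margin is simple arithmetic; a quick sketch assuming the flat ~5% figure above (rates are per million tokens, the $3.00 example is hypothetical):

```python
def effective_rate(upstream_per_mtok: float, margin: float = 0.05) -> float:
    """Upstream price per million tokens plus OpenRouter's ~5% margin."""
    return upstream_per_mtok * (1 + margin)

# e.g. a model billed at $3.00 per million input tokens upstream
print(f"${effective_rate(3.00):.2f}")  # → $3.15
```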
Privacy posture
OpenRouter publishes per-provider data policies — some upstreams retain data, some don't. The Provider Routing page on each model lists the policy for each route.
If you need a hard guarantee no upstream stores your data:
- Filter to providers with `data_policy = "none"` only — set this in the OpenRouter dashboard's privacy settings or per-key.
- Or use the model's `:nostore` suffix where supported.
openrouter.ai/docs/features/privacy-and-logging is the canonical reference.
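Put together, a request restricted to non-retaining upstreams might look like the payload below. This is a sketch: the `provider` routing object and its `data_collection: "deny"` field follow OpenRouter's provider-routing documentation, so verify the current schema there before relying on it:

```python
import json

# Hypothetical request body: the "provider" object asks OpenRouter to
# route only to hosts that do not retain prompt data. Field names are
# taken from OpenRouter's provider-routing docs; check the current
# schema at openrouter.ai/docs before depending on them.
payload = {
    "model": "anthropic/claude-sonnet-4-6",
    "messages": [{"role": "user", "content": "hello"}],
    "provider": {"data_collection": "deny"},
}
print(json.dumps(payload, indent=2))
```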
Troubleshooting
- `402 Insufficient credits` — top up the balance.
- `provider not available` — OpenRouter routes to multiple hosts per model and some go down. Retrying usually picks a different one. If a specific model is permanently unavailable, the model card on openrouter.ai shows route status.
- Tool use behaves differently than direct. Some upstreams have spotty function-calling support behind OpenRouter. If tools are critical, either use the direct provider adapter (Anthropic, OpenAI) or stick to models that explicitly advertise the `tools` capability in the model card.
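The retry advice can be sketched as a small wrapper. `call_with_retry` and the error strings here are illustrative, not Kenaz internals:

```python
import time

def call_with_retry(send, max_attempts: int = 3, backoff: float = 1.0):
    """Retry a request-sending callable on transient routing errors.

    OpenRouter fans a model out to several hosts, so a retry is often
    routed to a different upstream. `send` is any zero-arg callable
    that raises on failure; this is a sketch, not a Kenaz internal.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return send()
        except RuntimeError as exc:
            # Routing errors are worth retrying; a 402 (insufficient
            # credits) is not -- re-raise that immediately.
            if "402" in str(exc) or attempt == max_attempts:
                raise
            time.sleep(backoff * attempt)

# Demo with a fake sender that fails once, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] == 1:
        raise RuntimeError("provider not available")
    return "ok"

print(call_with_retry(flaky, backoff=0))  # → ok
```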