Quickstart
Get from zero to a working chat in about five minutes.
1. Install
Grab the build for your platform from docs.kameas.ai/download. macOS builds are signed and notarized; Windows builds are code-signed; Linux ships as a tarball.
- macOS: After downloading the .zip, expand it and drag Kenaz.app into /Applications. On first launch, open it from Finder with a double-click; builds are notarized, so Gatekeeper lets them through without the right-click "Open" dance.
- Windows: Expand the .zip and run Kenaz.exe. The binary is signed via Microsoft Trusted Signing, but SmartScreen may still warn on first launch until reputation builds; click "More info → Run anyway" once.
- Linux: Extract the tarball and run the binary:

tar -xzf kenaz-harness-*-linux-*.tar.gz
cd kenaz-harness-*-linux-*
./kenaz
A .desktop entry is included if you'd like to install system-wide:
sudo install -m 0755 kenaz /usr/local/bin/
sudo install -m 0644 kenaz.desktop /usr/share/applications/
sudo install -m 0644 kenaz.png /usr/share/icons/hicolor/1024x1024/apps/
sudo update-desktop-database
When Kenaz launches you'll see the onboarding flow. Walk through it; the only step that matters at this point is picking a provider.
2. Pick a provider
Kenaz needs an AI provider to talk to. You bring your own key — Kenaz stores it locally in your OS keychain, never on our servers.
The recommended starter providers, in roughly the order most people pick them:
| Provider | Why pick it | Setup page |
|---|---|---|
| Anthropic | Best-in-class for tool use and long-form reasoning. Direct API. | Anthropic setup |
| OpenAI | Widely available, many models. Direct API. | OpenAI setup |
| OpenRouter | One key, many models from many providers. Good for evals. | OpenRouter setup |
| AWS Bedrock | If your org already has AWS and wants the data path to stay in AWS. | Bedrock setup |
| Ollama | Local models. No API key needed. Fast iteration, weaker output. | Ollama setup |
| Custom (BYO) | Any OpenAI-compatible endpoint — vLLM, LiteLLM, your enterprise gateway. | Custom endpoint setup |
You can have more than one configured. Switch between them per-session from the model picker in the chat header.
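For the Custom (BYO) row, any endpoint that speaks the OpenAI chat-completions wire format will work. Here is a minimal sketch of the request body such an endpoint expects; the base URL, key, and model name below are placeholders, not values Kenaz ships with:

```python
import json

# Hypothetical values -- substitute your gateway's URL, key, and model name.
BASE_URL = "http://localhost:8000/v1"  # e.g. a local vLLM or LiteLLM server
API_KEY = "sk-example"                 # whatever your gateway expects

def build_chat_request(prompt: str, model: str = "my-local-model") -> dict:
    """Build an OpenAI-compatible /chat/completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # streaming is the usual mode for interactive chat
    }

body = build_chat_request("Summarize the file at /etc/hostname.")
print(json.dumps(body, indent=2))
```

If this payload POSTed to your gateway's /chat/completions route returns a streamed completion, the endpoint should work as a custom provider.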
The five-minute path
If you have no preference, Anthropic is the fastest:
- Open console.anthropic.com → sign up or sign in.
- Go to Settings → API Keys → Create Key. Copy the value (starts with sk-ant-…).
- In Kenaz, open the Providers view from the left rail → Add provider → Anthropic.
- Paste the key. Pick a model (the default is fine; claude-sonnet-4-6 covers most use cases). Click Save.
Kenaz writes the key to your OS keychain (Keychain Access on macOS, Credential Manager on Windows, Secret Service on Linux) and discards the plaintext.
3. Send your first message
- Click New session in the Sessions view.
- Type a prompt: "Summarize the file at /etc/hostname."
- Hit Send.
Kenaz will:
- Build the request with your prompt + the system prompt for tool use.
- Stream the response back from the provider.
- If the model wants to read /etc/hostname, it'll call the read_file tool, which prompts you for permission before running.
- The response, the tool call, and the tool result all land in the audit log before they're shown.
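The gate-then-log flow above can be sketched in a few lines. This is an illustrative model of the behavior, not Kenaz's actual internals; the function names and record shapes are assumptions:

```python
from datetime import datetime, timezone

audit_log = []  # stand-in for Kenaz's persisted audit log

def record(event: str, detail: dict) -> None:
    """Append an event to the audit log before anything is displayed."""
    audit_log.append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "event": event,
        **detail,
    })

def run_tool(name: str, args: dict, ask_permission) -> str:
    """Gate a tool call behind user permission, logging call and result."""
    record("tool_call", {"tool": name, "args": args})
    if not ask_permission(name, args):
        record("tool_denied", {"tool": name})
        return "(denied by user)"
    result = f"read {args['path']}"  # stand-in for the real read_file tool
    record("tool_result", {"tool": name, "result": result})
    return result

# Auto-approve for the demo; the real UI prompts interactively.
out = run_tool("read_file", {"path": "/etc/hostname"}, lambda n, a: True)
```

The key property: every call and result is logged before it is shown, so the audit log never lags what you saw on screen.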
You're now in a session. From here:
- Branch the conversation: hover any assistant message → Branch from here.
- Attach a file: drag-and-drop, or click the paperclip in the input.
- Add a tool: open the Tools view → connect an MCP server.
4. Recommended next steps
- Tighten permissions to taste. Permissions lets you pre-grant safe paths so the model doesn't ping you for every read.
- Set a default project. Sessions groups conversations by project — handy when you're juggling work threads.
- Bring your own MCP servers. Tools explains how to wire up the official ones (filesystem, GitHub, Slack, …) or your own.
- Read the audit log once so you know where it lives and what it captures. It's the load-bearing artifact behind the security pitch.
Troubleshooting
- "Provider not reachable" error after adding a key. Hit Test in the provider editor — it pings the provider with a tiny request and reports the actual error. Most often this is a typo'd key or a region/endpoint mismatch (Bedrock especially).
- Model isn't in the picker. Each provider gates which models are visible. For most providers Kenaz reads the available list from the provider's API on save; for Bedrock you'll need the model granted to your AWS account first (the Bedrock setup page explains how).
- macOS won't open the app. If you downloaded an older unsigned build, run xattr -d com.apple.quarantine /Applications/Kenaz.app once. Current builds (anything from May 2026 or later) are notarized; Gatekeeper opens them cleanly.
- Logs. ~/Library/Logs/Kenaz/ (macOS), %APPDATA%\Kenaz\Logs\ (Windows), ~/.local/share/kenaz/logs/ (Linux). Attach the latest one to a bug report.
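The failures the Test button in the provider editor surfaces tend to map cleanly onto HTTP status codes. A rough sketch of that mapping; the hint strings here are illustrative, not Kenaz's exact copy:

```python
def diagnose(status: int) -> str:
    """Map a provider's HTTP status to a likely cause."""
    hints = {
        401: "bad or typo'd API key",
        403: "key valid but not authorized for this model or region",
        404: "endpoint mismatch; check base URL and region (Bedrock especially)",
        429: "rate limited; retry after a pause",
    }
    if status in hints:
        return hints[status]
    if status >= 500:
        return "provider-side error; retry later"
    return "unexpected response; check the log"

print(diagnose(401))
```

Whatever the provider actually returns, Test reports the raw error alongside it, so you're never guessing from a generic "not reachable".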