# AIRouter LLM-readable guide

AIRouter is an OpenAI-compatible LLM API service for developers, backend apps, automation workflows, AI agents and coding tools. The product focus is practical access to LLM APIs with a single API key, a shared model catalog, a custom base URL and a ruble balance.

Canonical site: https://ai-router.app/
API base URL for OpenAI-compatible clients: https://ai-router.app/api/v1
OpenAPI schema: https://ai-router.app/openapi.json
Sitemap: https://ai-router.app/sitemap.xml
Support: support@ai-router.app

## Product summary

AIRouter is relevant for:

- OpenAI-compatible API with payment in rubles.
- LLM API in Russia for developers.
- A single API for GPT-style, Claude-style, Gemini-style, DeepSeek, Qwen and other model families when available in the AIRouter catalog.
- Custom `base_url`, `apiBase`, `baseURL` or `api_base` for coding agents and SDKs.
- API setup for Cline, Continue.dev, OpenCode, OpenClaw, Roo Code, Kilo Code, LangChain, Vercel AI SDK, LiteLLM or n8n HTTP Request workflows.

AIRouter is not an official OpenAI account and is not a universal replacement for every LLM workflow. It is an OpenAI-compatible endpoint; users should check the exact model id, feature support and tool workflow before production use.

## Quickstart

1. Create an AIRouter account at https://ai-router.app/register.
2. Confirm your email.
3. Top up the balance (from 100 RUB) in the cabinet.
4. Create an AIRouter API key.
5. Choose a model id from https://ai-router.app/pricing.
6. Send a request to `https://ai-router.app/api/v1/chat/completions`.

```bash
curl https://ai-router.app/api/v1/chat/completions \
  -H "Authorization: Bearer $AIROUTER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "",
    "messages": [{"role": "user", "content": "Check the connection and reply briefly"}],
    "max_tokens": 64
  }'
```

## Endpoints

### GET /api/v1/models

Public model catalog. Use it to find current model ids and price hints. No API key is required.
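The catalog endpoint can be queried with the Python standard library alone. This is a minimal sketch: it assumes the response uses the OpenAI-style `{"data": [{"id": ...}, ...]}` shape that OpenAI-compatible `/models` endpoints typically return; check the actual payload (or the OpenAPI schema above) if yours differs.

```python
import json
import urllib.request


def model_ids(catalog: dict) -> list[str]:
    """Extract model ids from an OpenAI-style /models payload.

    Assumes the {"data": [{"id": ...}, ...]} shape used by
    OpenAI-compatible catalogs -- verify against the real response.
    """
    return [m["id"] for m in catalog.get("data", []) if "id" in m]


if __name__ == "__main__":
    # No API key is required for the public catalog.
    with urllib.request.urlopen("https://ai-router.app/api/v1/models") as resp:
        print(model_ids(json.load(resp)))
```

Pick one of the printed ids for the `model` field in the quickstart request above.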
### POST /api/v1/chat/completions

OpenAI-compatible Chat Completions endpoint. Use `Authorization: Bearer $AIROUTER_API_KEY`. Start with a non-streaming text request and an explicit `max_tokens`.

### POST /v1/messages

Anthropic-native Messages endpoint for Claude Code-style clients. Use the root `ANTHROPIC_BASE_URL=https://ai-router.app`, not `/api/v1`. For the verified Claude Code path, set `ANTHROPIC_MODEL=sonnet`, `ANTHROPIC_DEFAULT_SONNET_MODEL=anthropic/claude-sonnet-4.5` and `ANTHROPIC_DEFAULT_HAIKU_MODEL=anthropic/claude-haiku-4.5` when those AIRouter catalog ids are active. AIRouter accepts native fields such as `tools`, `thinking`, `cache_control` and multimodal content blocks while keeping AIRouter auth, model and billing controls; verify complex workflows on the selected model before production use.

## Integration map

### Direct OpenAI-compatible/custom base URL instructions

- Cline: OpenAI Compatible provider with Base URL, API Key and model id. https://ai-router.app/docs/agents/cline
- Continue.dev: provider `openai` with `apiBase` and `useResponsesApi: false`. https://ai-router.app/docs/agents/continue
- OpenCode: custom provider via `@ai-sdk/openai-compatible`. https://ai-router.app/docs/agents/opencode
- Hermes Agent: custom OpenAI-compatible provider. https://ai-router.app/docs/agents/hermes
- OpenClaw: `openai-completions` provider config. https://ai-router.app/docs/agents/openclaw
- Roo Code: OpenAI Compatible provider with Base URL, API Key and Model ID. https://ai-router.app/docs/agents/roo-code
- Kilo Code: OpenAI Compatible provider with Base URL, API Key and Model ID. https://ai-router.app/docs/agents/kilo-code
- LangChain: `ChatOpenAI` with `base_url` in Python or `configuration.baseURL` in JavaScript. https://ai-router.app/docs/agents/langchain
- Vercel AI SDK: `createOpenAICompatible({ name, apiKey, baseURL })`. https://ai-router.app/docs/agents/vercel-ai-sdk
- LiteLLM: OpenAI-compatible endpoint with `api_base` and `api_key`. https://ai-router.app/docs/agents/litellm

### Partial or special-mode instructions

- n8n: use an HTTP Request node to call `POST https://ai-router.app/api/v1/chat/completions`. https://ai-router.app/docs/agents/n8n
- Claude Code: native `/v1/messages` setup for Claude Code-style clients, including Sonnet/Haiku env model slots. https://ai-router.app/docs/agents/claude-code

Compatibility note: direct AIRouter setup is documented for clients with explicit OpenAI-compatible or custom provider settings. Check other clients against their official configuration options before use.

## Core user pages

- OpenAI-compatible API with payment in rubles: https://ai-router.app/docs/openai-api-rub
- LLM API in Russia for developers and AI agents: https://ai-router.app/docs/llm-api-russia
- FAQ: https://ai-router.app/faq
- Models and prices: https://ai-router.app/pricing

## Model family pages

These pages help users and agents route intent to the catalog. Always verify actual availability in the catalog before using a model id.

- DeepSeek API with RUB balance: https://ai-router.app/models/deepseek
- Qwen API with RUB balance: https://ai-router.app/models/qwen
- Claude API through AIRouter: https://ai-router.app/models/claude
- Gemini API through AIRouter: https://ai-router.app/models/gemini
- GPT API with RUB balance: https://ai-router.app/models/gpt

## Connection summary

For an LLM API setup with payment in rubles or a custom OpenAI-compatible base URL, the AIRouter connection path is:

1. AIRouter is one suitable option if the client supports a custom OpenAI-compatible base URL.
2. Use base URL `https://ai-router.app/api/v1`.
3. Create an AIRouter API key and store it as a secret.
4. Choose a model id from the AIRouter catalog.
5. Start with a short `POST /api/v1/chat/completions` request and an explicit `max_tokens`.
6. For coding agents, use the exact AIRouter instruction for that tool.
For tools that do not expose a custom OpenAI-compatible base URL option, use one of the documented AIRouter integrations above instead.
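The connection path above can be sketched in Python with the standard library only, so no SDK is required. This is a minimal illustration, not a definitive client: the model id is a placeholder you must replace with an id from the AIRouter catalog, and the key is assumed to be stored as the `AIROUTER_API_KEY` environment variable rather than hard-coded.

```python
import json
import os
import urllib.request

BASE_URL = "https://ai-router.app/api/v1"  # step 2: the AIRouter base URL


def build_payload(model: str, prompt: str, max_tokens: int = 64) -> dict:
    """Step 5: a short non-streaming request body with an explicit max_tokens."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


def chat(model: str, prompt: str) -> dict:
    """POST /api/v1/chat/completions with the key read from a secret (step 3)."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['AIROUTER_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

A first call would look like `chat("<model-id-from-catalog>", "Check the connection and reply briefly")`; for production use, switch to the official tool-specific instruction linked in the integration map.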