LLM Aggregator API
FlowDot exposes an OpenAI-compatible HTTP API. Point any OpenAI SDK at https://flowdot.ai/api/v1 with a Bearer fd_agg_… key and you can call any supported model — OpenAI, Anthropic, Google, Redpill, OpenRouter, and more — through a single endpoint. It's the same idea as OpenRouter.
TL;DR
- Base URL: `https://flowdot.ai/api/v1`
- Auth: `Authorization: Bearer fd_agg_xxxxxxxxxxxxx`
- Get a key: open any app page (e.g. the Dashboard) → click the settings gear in the top-right → FlowDot API Keys tab → create and manage keys there.
- Model IDs are nested: `openai/gpt-4o-mini`, `anthropic/claude-sonnet-4-6`, `redpill/google/gemini-2.5-flash-lite`
- Compatible with: `openai` (Python), `openai` (Node), any HTTP client
- OpenAPI spec: `/api/learn-center/openapi.json`
Quickstart
curl
```bash
curl https://flowdot.ai/api/v1/chat/completions \
  -H "Authorization: Bearer fd_agg_xxxxxxxxxxxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello from FlowDot"}]
  }'
```
Python (openai SDK)
```python
from openai import OpenAI

client = OpenAI(
    api_key="fd_agg_xxxxxxxxxxxxx",
    base_url="https://flowdot.ai/api/v1",
)

resp = client.chat.completions.create(
    model="anthropic/claude-sonnet-4-6",
    messages=[{"role": "user", "content": "Summarise FlowDot in one sentence."}],
)
print(resp.choices[0].message.content)
```
JavaScript / TypeScript (openai SDK)
```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.FLOWDOT_AGG_KEY,
  baseURL: 'https://flowdot.ai/api/v1',
});

const resp = await client.chat.completions.create({
  model: 'redpill/google/gemini-2.5-flash-lite',
  messages: [{ role: 'user', content: 'Hello from FlowDot' }],
});
console.log(resp.choices[0].message.content);
```
List available models
```bash
curl https://flowdot.ai/api/v1/models \
  -H "Authorization: Bearer fd_agg_xxxxxxxxxxxxx"
```
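Assuming the endpoint returns the standard OpenAI-style list shape (`{"object": "list", "data": [{"id": ...}, ...]}`), a quick way to see what's available per provider is to group the IDs by their first path segment — a sketch:

```python
# Group model IDs from a /v1/models response by their top-level provider.
# Assumes the standard OpenAI-style list shape: {"data": [{"id": "..."}, ...]}.
from collections import defaultdict


def models_by_provider(payload: dict) -> dict:
    grouped = defaultdict(list)
    for entry in payload.get("data", []):
        provider = entry["id"].split("/", 1)[0]
        grouped[provider].append(entry["id"])
    return dict(grouped)


# Example with IDs from this page; in practice, feed it the JSON
# returned by GET https://flowdot.ai/api/v1/models.
sample = {"data": [
    {"id": "openai/gpt-4o-mini"},
    {"id": "anthropic/claude-sonnet-4-6"},
    {"id": "redpill/google/gemini-2.5-flash-lite"},
]}
print(models_by_provider(sample))
```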
Nested Model IDs
Model IDs are path-style, with the provider first. When an aggregator provider (like Redpill or OpenRouter) is the first segment, the rest of the path is forwarded upstream — hence the "double-nested" form.
| Example ID | Route |
|---|---|
| `openai/gpt-4o-mini` | Direct to OpenAI |
| `anthropic/claude-sonnet-4-6` | Direct to Anthropic |
| `google/gemini-2.5-flash` | Direct to Google |
| `redpill/google/gemini-2.5-flash-lite` | FlowDot → Redpill → Google |
| `redpill/deepseek-ai/DeepSeek-V3` | FlowDot → Redpill → DeepSeek |
| `openrouter/meta-llama/llama-3.3-70b-instruct` | FlowDot → OpenRouter → Meta |
Reasoning models (`openai/o1`, `openai/o3`, `openai/gpt-5`) accept `max_completion_tokens` rather than `max_tokens` and ignore `temperature`. The SDK handles this for you.
Call GET /api/v1/models for the authoritative live list of IDs you can use.
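The routing rule can be sketched as a small helper. This is illustrative only — it assumes `redpill` and `openrouter` are the aggregator prefixes, which the live `/api/v1/models` list is authoritative on:

```python
# Assumption: these are the aggregator prefixes whose remaining path
# is forwarded upstream; all other IDs route directly to the provider.
AGGREGATORS = {"redpill", "openrouter"}


def route_chain(model_id: str) -> list:
    """Return the hops a request takes, e.g. FlowDot -> Redpill -> Google."""
    provider, _, rest = model_id.partition("/")
    if provider in AGGREGATORS:
        upstream = rest.split("/", 1)[0]
        return ["flowdot", provider, upstream]
    return ["flowdot", provider]


print(route_chain("redpill/google/gemini-2.5-flash-lite"))
# -> ['flowdot', 'redpill', 'google']
```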
Three Funding Paths
You can pay for inference on FlowDot in three ways, independently:
| Mechanism | Who pays upstream | Where it applies |
|---|---|---|
| BYOK — Bring Your Own Key | You, directly on your provider account | Any surface. Store provider keys in the settings modal under the LLM BYOK tab (and Audio BYOK for TTS/STT). Requests routed through your key are not charged FlowDot credits. |
| OAuth-connected provider account | Your own ChatGPT / Claude subscription | flowdot-cli and the FlowDot Native desktop app only. Lets you authorise your personal subscription once; tokens are stored locally. |
| FlowDot Credits (default for `/api/v1/*`) | FlowDot, reimbursed from your prepaid balance | Any surface, including the aggregator API. Buy credits at flowdot.ai/credits. Per-token pricing is data-driven — see `GET /api/v1/models` for available models. |
All three can be mixed on one account — pick whichever makes sense per-model or per-integration.
Settings Modal — where each thing lives
Every user-level setting on FlowDot is managed from one place. Open any app page (Dashboard, Workflows, Agent, etc.) and click the settings gear icon in the top-right. The settings modal opens with these tabs:
| Tab | What it does |
|---|---|
| FlowDot Credits | View your credit balance, transaction history, and buy more credits. Credits fund /api/v1/* calls when you aren't using BYOK or an OAuth-connected provider account. |
| FlowDot API Keys | Create, rename, rotate, and revoke fd_agg_… keys for the OpenAI-compatible aggregator at /api/v1. Plaintext is shown once per key. |
| MCP Tokens | Create fd_mcp_… tokens for the Platform / MCP API (/api/mcp/v1, /api/hub/*) — workflows, apps, custom nodes, knowledge base, agent chat. |
| Notifications | Email / in-app notification preferences. |
| LLM BYOK | Store your own provider API keys (OpenAI, Anthropic, Google, etc.). When a workflow / aggregator call resolves to a model whose provider has a BYOK key here, the call is billed to your provider account and no FlowDot credits are consumed. |
| Audio BYOK | Same idea as LLM BYOK but for TTS / STT providers (e.g. ElevenLabs, Deepgram). |
| Preferred Models | Default model choices for the "Simple / Capable / Complex" tiers used by LLM nodes and agents. |
| Research | Preferences for deep-research / web-research features. |
Streaming
Pass `"stream": true`. The response is Server-Sent Events — `data: {...}\n\n` chunks terminated by `data: [DONE]`, identical in shape to OpenAI's streaming response.
```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.FLOWDOT_AGG_KEY,
  baseURL: 'https://flowdot.ai/api/v1',
});

const stream = await client.chat.completions.create({
  model: 'openai/gpt-4o-mini',
  messages: [{ role: 'user', content: 'Write a haiku about bees.' }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}
```
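If you stream with a plain HTTP client rather than the SDK, you parse the `data:` lines yourself. A minimal sketch of that parsing, assuming the OpenAI chunk shape described above:

```python
import json


def iter_sse_content(lines):
    """Yield content deltas from raw SSE lines until the [DONE] sentinel."""
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines between events
        payload = line[len("data: "):]
        if payload == "[DONE]":
            return
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        if delta.get("content") is not None:
            yield delta["content"]


# Example with hand-written chunks shaped like OpenAI streaming events;
# in practice, iterate the response body lines from your HTTP client.
sample = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    '',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    'data: [DONE]',
]
print("".join(iter_sse_content(sample)))  # -> Hello
```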
Tool Calling
The aggregator accepts OpenAI's tools / tool_choice shape and returns tool_calls on the assistant message. Passthrough behaviour depends on upstream support — OpenAI, Anthropic, and most Redpill / OpenRouter models all work; a handful of smaller open models do not.
```typescript
const resp = await client.chat.completions.create({
  model: 'anthropic/claude-sonnet-4-6',
  messages: [{ role: 'user', content: 'What is the weather in Paris?' }],
  tools: [{
    type: 'function',
    function: {
      name: 'get_weather',
      description: 'Get current weather for a city',
      parameters: {
        type: 'object',
        properties: { city: { type: 'string' } },
        required: ['city'],
      },
    },
  }],
  tool_choice: 'auto',
});
console.log(resp.choices[0].message.tool_calls);
```
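After the model returns `tool_calls`, you execute the tool yourself and send each result back as a `role: "tool"` message before calling the API again. A sketch of building those replies, assuming the standard OpenAI tool-call shape (the local `get_weather` implementation is hypothetical):

```python
import json


# Hypothetical local implementation of the get_weather tool.
def get_weather(city: str) -> dict:
    return {"city": city, "temp_c": 18}


TOOLS = {"get_weather": get_weather}


def tool_result_messages(assistant_message: dict) -> list:
    """Build role="tool" replies for each tool call on an assistant message."""
    replies = []
    for call in assistant_message.get("tool_calls", []):
        fn = TOOLS[call["function"]["name"]]
        args = json.loads(call["function"]["arguments"])
        replies.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": json.dumps(fn(**args)),
        })
    return replies


# Shaped like the assistant message the aggregator returns:
assistant = {
    "role": "assistant",
    "tool_calls": [{
        "id": "call_1",
        "type": "function",
        "function": {"name": "get_weather", "arguments": '{"city": "Paris"}'},
    }],
}
# Append `assistant` plus these replies to your messages, then call the API again.
print(tool_result_messages(assistant))
```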
Errors
The aggregator returns OpenAI-shaped error objects. The ones specific to FlowDot:
| HTTP | Code | Meaning |
|---|---|---|
| 401 | invalid_api_key | Missing / malformed / revoked fd_agg_ token. |
| 402 | insufficient_credits | Estimated cost (with 20% safety buffer) exceeds your credit balance. Top up at /credits or switch to BYOK. |
| 429 | model_rate_limited | Upstream provider rate-limited the request. Honour the Retry-After header. |
| 502 | upstream_error | Upstream provider failed. Credit is not deducted for failed calls. |
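A sketch of client-side retry policy for the two retryable rows above — honour `Retry-After` on 429 and back off on 502. The backoff values are illustrative choices, not prescribed by FlowDot:

```python
def retry_delay(status: int, attempt: int, retry_after=None):
    """Seconds to wait before retrying, or None if the error isn't retryable.

    429 (model_rate_limited): honour the Retry-After header when present.
    502 (upstream_error): exponential backoff; credits aren't deducted on failure.
    401/402 are not retryable - fix the key or top up credits instead.
    """
    if status == 429:
        if retry_after is not None:
            return float(retry_after)
        return min(2.0 ** attempt, 30.0)
    if status == 502:
        return min(2.0 ** attempt, 30.0)
    return None


print(retry_delay(429, 0, retry_after="7"))  # -> 7.0
```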
All Aggregator Endpoints
All require Authorization: Bearer fd_agg_….
| Method | Path | Purpose |
|---|---|---|
| POST | /api/v1/chat/completions | Chat completions (streaming + non-streaming) |
| GET | /api/v1/models | List all models |
| GET | /api/v1/models/{model} | Model detail |
| POST | /api/v1/embeddings | Embeddings |
| POST | /api/v1/audio/speech | Text-to-speech |
| POST | /api/v1/audio/transcriptions | Speech-to-text |
| POST | /api/v1/audio/translations | Audio translation to English |
| POST | /api/v1/images/generations | Image generation |
Dashboard-only key management (auth:sanctum, session)
These are not callable with an fd_agg_ token — they're for the web dashboard flow.
| Method | Path | Purpose |
|---|---|---|
| GET / POST | /api/aggregator/keys | List / create keys (POST returns the plaintext key once) |
| GET / PUT / DELETE | /api/aggregator/keys/{id} | Show / rename / revoke |
| GET | /api/aggregator/keys/{id}/usage | Per-key usage stats |
| POST | /api/aggregator/keys/{id}/regenerate | Rotate |
| GET | /api/user/credits/balance | Current credit balance |
| GET | /api/user/credits/transactions | Credit ledger |