Setting Up LLM Providers (14+ Options)
Unlock unlimited AI power by connecting your own API keys. Learn about FlowDot's provider system, model tiers, and how to use local AI with Ollama.
FlowDot supports 14+ AI providers out of the box. This tutorial shows you how to connect your own API keys to unlock more powerful models and unlimited usage.
Why Connect Your Own Keys?
| FlowDot Free Credits | Your Own Keys |
|---|---|
| Limited tokens | Unlimited usage |
| Cost-effective models | Any model you choose |
| Great for learning | Production-ready |
Supported Providers
FlowDot integrates with all major AI providers:
| Provider | Models | Specialty |
|---|---|---|
| OpenAI | GPT-4, GPT-4o, o1 | General purpose |
| Anthropic | Claude 3.5, Claude 4 | Long context, reasoning |
| Google | Gemini Pro, Gemini Ultra | Multimodal |
| Mistral | Mistral Large, Codestral | European, efficient |
| Cohere | Command R+ | Enterprise |
| Groq | Llama, Mixtral | Fast inference |
| OpenRouter | 100+ models | Aggregator |
| Ollama | Local models | Privacy, offline |
| xAI | Grok | Reasoning |
| DeepSeek | DeepSeek V3 | Coding, math |
| Together AI | Open source models | Cost-effective |
| Perplexity | Sonar | Search-enhanced |
| Fireworks | Various | Fast inference |
| And more... | Always adding | Community requests |
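Several of the providers above (and Ollama) expose OpenAI-compatible endpoints, which is why one client abstraction can cover many of them. Below is a minimal sketch of that idea; the base URLs are the providers' published OpenAI-compatible endpoints, and `chat_url` is a hypothetical helper, not part of FlowDot.

```python
# Sketch: many providers speak the OpenAI chat-completions wire format,
# so switching providers is often just a base-URL (and key) swap.
# Base URLs are the providers' documented OpenAI-compatible endpoints.
OPENAI_COMPATIBLE = {
    "groq": "https://api.groq.com/openai/v1",
    "openrouter": "https://openrouter.ai/api/v1",
    "together": "https://api.together.xyz/v1",
    "fireworks": "https://api.fireworks.ai/inference/v1",
    "ollama": "http://localhost:11434/v1",  # local, no key needed
}

def chat_url(provider: str) -> str:
    """Return the chat-completions URL for an OpenAI-compatible provider."""
    return f"{OPENAI_COMPATIBLE[provider]}/chat/completions"
```

Providers with their own wire formats (e.g., Anthropic) need provider-specific clients, which is part of what an aggregator layer abstracts away.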
Step 1: Add an API Key
- Click the hamburger menu (three lines) in the top right
- Navigate to API Settings
- Click the LLM API Keys tab
- Click Add Provider
- Select your provider (e.g., OpenAI)
- Paste your API key
- Click Test to verify it works
- Click Save
Where to Get API Keys
| Provider | URL | Free Tier? |
|---|---|---|
| OpenAI | platform.openai.com | Pay as you go |
| Anthropic | console.anthropic.com | Pay as you go |
| Google | ai.google.dev | Free tier available |
| Groq | console.groq.com | Generous free tier |
| OpenRouter | openrouter.ai | Pay as you go |
Step 2: Configure Model Tiers
FlowDot uses a tier system for quick model selection:
| Tier | Purpose | Example |
|---|---|---|
| FlowDot | Free platform credits | Cost-effective model |
| Simple | Fast, basic tasks | GPT-4o-mini, Haiku |
| Capable | Most use cases | GPT-4o, Sonnet |
| Complex | Advanced reasoning | Claude Opus, o1 |
Setting Your Preferred Models
- Go to API Settings > Preferred Models
- For each tier (Simple, Capable, Complex):
  - Select a provider from the dropdown
  - Select a model from that provider
- Click Save
Example Configuration:
| Tier | Provider | Model |
|---|---|---|
| Simple | OpenAI | gpt-4o-mini |
| Capable | Anthropic | claude-3-5-sonnet |
| Complex | Anthropic | claude-opus-4-5 |
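Conceptually, the tier system is just a lookup from tier name to your preferred (provider, model) pair. The sketch below mirrors the example configuration above; the names are illustrative, not FlowDot's internal API.

```python
# Tier -> (provider, model), mirroring the example configuration table.
PREFERRED_MODELS = {
    "Simple": ("OpenAI", "gpt-4o-mini"),
    "Capable": ("Anthropic", "claude-3-5-sonnet"),
    "Complex": ("Anthropic", "claude-opus-4-5"),
}

def resolve_tier(tier: str) -> tuple[str, str]:
    """Return (provider, model) for a tier, falling back to Capable."""
    return PREFERRED_MODELS.get(tier, PREFERRED_MODELS["Capable"])
```

The fallback means a node with an unrecognized tier still resolves to a sensible default rather than failing.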
Step 3: Use Models in Workflows
Quick Select Buttons
Every LLM node shows four buttons:
[FlowDot] [Simple] [Capable] [Complex]
Click any button to use that tier's configured model.
In the Editor
- Select an LLM Query node
- Click the tier button for your desired model
- The node uses your preferred model for that tier
In the Dashboard
- Open a workflow's dashboard
- Click LLM Setup in the toolbar
- See all LLM nodes in the workflow
- Change tiers for any node
Using Local Models with Ollama
Run AI completely locally for privacy and offline use.
Setup Ollama
- Install Ollama from ollama.ai
- Pull a model: `ollama pull llama3.2`
- Ollama runs on `localhost:11434` by default
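Once Ollama is listening on `localhost:11434`, any HTTP client can talk to its REST API directly. A minimal stdlib sketch of calling Ollama's `/api/generate` endpoint (this is Ollama's own API, independent of FlowDot):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default address

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but don't send) a request to Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload.encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send it (requires Ollama running locally):
# with urllib.request.urlopen(build_generate_request("llama3.2", "Hi")) as r:
#     print(json.loads(r.read())["response"])
```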
Connect to FlowDot
- Go to API Settings > LLM API Keys
- FlowDot automatically detects Ollama if running
- You'll see "Connected - X models found"
- Use Ollama models in any LLM node
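The "X models found" count presumably comes from Ollama's model-listing endpoint, `GET /api/tags`, which returns a JSON object with a `models` array. A sketch of parsing that response (the sample payload below is abridged and illustrative):

```python
import json

def parse_model_names(tags_response: str) -> list[str]:
    """Extract model names from Ollama's GET /api/tags JSON response."""
    return [m["name"] for m in json.loads(tags_response)["models"]]

# Abridged example of the response shape from GET http://localhost:11434/api/tags:
sample = '{"models": [{"name": "llama3.2:latest"}, {"name": "codestral:latest"}]}'
```

Hitting this endpoint yourself is a quick way to confirm which models FlowDot should be seeing.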
Recommended Local Models
| Model | Size | Best For |
|---|---|---|
| llama3.2 | 3B | Fast, general use |
| llama3.1 | 8B | Better quality |
| codestral | 22B | Coding tasks |
| mixtral | 47B | Complex reasoning |
Step 4: Using Models in Different Contexts
Workflow Editor
The tier buttons on each node let you quickly switch:
- Use Simple for data transformation, JSON parsing
- Use Capable for most AI tasks
- Use Complex for reasoning, analysis, creative work
Editor AI Assistant
The AI chat in the editor uses your tier settings:
- Click the AI Assistant icon (top right)
- Select a tier for your conversation
- Ask for help building workflows
Agent Conversations
The Agent also uses your model preferences:
- Simple, Capable, Complex buttons in the Agent UI
- Switch mid-conversation to change model power
Cost Optimization Tips
- Start with FlowDot for development and testing
- Use Simple for straightforward transformations
- Reserve Complex for tasks that truly need it
- Mix tiers in workflows - use Simple for preprocessing, Complex for the main task
- Check OpenRouter for cost comparisons across providers
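One way to apply these tips systematically is a small routing heuristic: send each step to the cheapest tier that can handle it. This is an illustrative sketch, not FlowDot's logic; the task keywords are assumptions.

```python
# Illustrative heuristic: route each workflow step to the cheapest
# adequate tier, per the cost tips above.
SIMPLE_TASKS = {"parse", "extract", "reformat", "translate"}
COMPLEX_TASKS = {"reason", "analyze", "plan"}

def pick_tier(task: str) -> str:
    """Map a task keyword to a tier, defaulting to Capable."""
    if task in SIMPLE_TASKS:
        return "Simple"
    if task in COMPLEX_TASKS:
        return "Complex"
    return "Capable"
```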
Troubleshooting
"API Key Invalid"
- Ensure no extra spaces when pasting
- Check the key hasn't expired
- Verify you have credits/quota with the provider
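The stray-whitespace problem can be checked programmatically. A sketch of key hygiene; the prefix checks rely on the publicly known key formats (OpenAI keys start with `sk-`, Anthropic keys with `sk-ant-`) and are a loose shape test, not a real validation:

```python
def sanitize_key(raw: str) -> str:
    """Strip whitespace/newlines that often sneak in when pasting a key."""
    return raw.strip()

def looks_plausible(key: str, provider: str) -> bool:
    """Loose shape check using known public key prefixes; not an API call."""
    prefixes = {"OpenAI": "sk-", "Anthropic": "sk-ant-"}
    prefix = prefixes.get(provider)
    return bool(key) and (prefix is None or key.startswith(prefix))
```

A key that fails the shape check was probably truncated or mixed up between providers; a key that passes can still be expired or out of quota, which only the provider's API can tell you.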
"Model Not Found"
- Some models require account-level access from the provider before they appear
- Refresh the model list after adding a new key
Ollama Not Detected
- Ensure Ollama is running: `ollama serve`
- Check it's accessible at `localhost:11434`
- Refresh the LLM API Keys page
Summary
FlowDot's flexible provider system lets you:
- Use free credits to learn
- Connect 14+ providers for production
- Run local models for privacy
- Mix and match models across tiers
Next, learn about the AI Editor Assistant to build workflows with natural language!