Setting Up LLM Providers (14+ Options)

Unlock unlimited AI power by connecting your own API keys. Learn about FlowDot's provider system, model tiers, and how to use local AI with Ollama.

FlowDot supports 14+ AI providers out of the box. This tutorial shows you how to connect your own API keys to unlock more powerful models and unlimited usage.

Why Connect Your Own Keys?

| FlowDot Free Credits | Your Own Keys |
| --- | --- |
| Limited tokens | Unlimited usage |
| Cost-effective models | Any model you choose |
| Great for learning | Production-ready |

Supported Providers

FlowDot integrates with all major AI providers:

| Provider | Models | Specialty |
| --- | --- | --- |
| OpenAI | GPT-4, GPT-4o, o1 | General purpose |
| Anthropic | Claude 3.5, Claude 4 | Long context, reasoning |
| Google | Gemini Pro, Gemini Ultra | Multimodal |
| Mistral | Mistral Large, Codestral | European, efficient |
| Cohere | Command R+ | Enterprise |
| Groq | Llama, Mixtral | Fast inference |
| OpenRouter | 100+ models | Aggregator |
| Ollama | Local models | Privacy, offline |
| xAI | Grok | Reasoning |
| DeepSeek | DeepSeek V3 | Coding, math |
| Together AI | Open source models | Cost-effective |
| Perplexity | Sonar | Search-enhanced |
| Fireworks | Various | Fast inference |
| And more... | Always adding | Community requests |

Step 1: Add an API Key

  1. Click the hamburger menu (three lines) in the top right
  2. Navigate to API Settings
  3. Click the LLM API Keys tab
  4. Click Add Provider
  5. Select your provider (e.g., OpenAI)
  6. Paste your API key
  7. Click Test to verify it works
  8. Click Save
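Under the hood, a key test like the one in step 7 usually amounts to one authenticated request to the provider's model-listing endpoint. Here's a minimal sketch for OpenAI; the endpoint and `Bearer` header come from OpenAI's public API, but the helper function is ours, not a FlowDot internal:

```python
import urllib.request

def build_key_test_request(api_key: str) -> urllib.request.Request:
    """Build (but don't send) the kind of request a key test boils down to:
    GET the provider's model list with the key as a Bearer token."""
    return urllib.request.Request(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = build_key_test_request("sk-...")  # placeholder key
print(req.get_header("Authorization"))  # → Bearer sk-...
```

If the response is `401 Unauthorized`, the key is wrong or revoked; a `200` with a model list means it works.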

Where to Get API Keys

| Provider | URL | Free Tier? |
| --- | --- | --- |
| OpenAI | platform.openai.com | Pay as you go |
| Anthropic | console.anthropic.com | Pay as you go |
| Google | ai.google.dev | Free tier available |
| Groq | console.groq.com | Generous free tier |
| OpenRouter | openrouter.ai | Pay as you go |

Step 2: Configure Model Tiers

FlowDot uses a tier system for quick model selection:

| Tier | Purpose | Example |
| --- | --- | --- |
| FlowDot | Free platform credits | Cost-effective model |
| Simple | Fast, basic tasks | GPT-4o-mini, Haiku |
| Capable | Most use cases | GPT-4o, Sonnet |
| Complex | Advanced reasoning | Claude Opus, o1 |

Setting Your Preferred Models

  1. Go to API Settings > Preferred Models
  2. For each tier (Simple, Capable, Complex):
    • Select a provider from the dropdown
    • Select a model from that provider
  3. Click Save

Example Configuration:

| Tier | Provider | Model |
| --- | --- | --- |
| Simple | OpenAI | gpt-4o-mini |
| Capable | Anthropic | claude-3-5-sonnet |
| Complex | Anthropic | claude-opus-4-5 |
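In code terms, a tier system like this is just a lookup from tier name to a (provider, model) pair. A sketch that mirrors the example configuration above (the dictionary and helper are illustrative, not a FlowDot API):

```python
# Mirror of the example tier configuration above (names are illustrative).
PREFERRED_MODELS = {
    "simple":  {"provider": "openai",    "model": "gpt-4o-mini"},
    "capable": {"provider": "anthropic", "model": "claude-3-5-sonnet"},
    "complex": {"provider": "anthropic", "model": "claude-opus-4-5"},
}

def resolve_tier(tier: str) -> tuple[str, str]:
    """Return (provider, model) for a tier, falling back to 'capable'."""
    entry = PREFERRED_MODELS.get(tier.lower(), PREFERRED_MODELS["capable"])
    return entry["provider"], entry["model"]

print(resolve_tier("Complex"))  # → ('anthropic', 'claude-opus-4-5')
print(resolve_tier("unknown"))  # → ('anthropic', 'claude-3-5-sonnet')
```

The point of the indirection: workflows reference a tier, not a hard-coded model, so swapping your Capable model later doesn't require touching any node.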

Step 3: Use Models in Workflows

Quick Select Buttons

Every LLM node shows four buttons:

[FlowDot] [Simple] [Capable] [Complex]

Click any button to use that tier's configured model.

In the Editor

  1. Select an LLM Query node
  2. Click the tier button for your desired model
  3. The node uses your preferred model for that tier

In the Dashboard

  1. Open a workflow's dashboard
  2. Click LLM Setup in the toolbar
  3. See all LLM nodes in the workflow
  4. Change tiers for any node

Using Local Models with Ollama

Run AI completely locally for privacy and offline use.

Setup Ollama

  1. Install Ollama from ollama.ai
  2. Pull a model: ollama pull llama3.2
  3. Ollama runs on localhost:11434
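FlowDot's auto-detection (next section) presumably just probes that local port. You can run the same check yourself; this sketch uses Ollama's public `GET /api/tags` endpoint, which lists the models you've pulled (error handling kept minimal):

```python
import json
import urllib.request

def list_ollama_models(payload: bytes) -> list[str]:
    """Extract model names from an Ollama GET /api/tags response body."""
    return [m["name"] for m in json.loads(payload).get("models", [])]

def detect_ollama(base_url: str = "http://localhost:11434") -> list[str]:
    """Return pulled model names, or [] if Ollama isn't reachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            return list_ollama_models(resp.read())
    except OSError:
        return []

# Parsing a sample response (shape matches Ollama's /api/tags):
sample = b'{"models": [{"name": "llama3.2:latest"}, {"name": "mixtral:latest"}]}'
print(list_ollama_models(sample))  # → ['llama3.2:latest', 'mixtral:latest']
```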

Connect to FlowDot

  1. Go to API Settings > LLM API Keys
  2. FlowDot automatically detects Ollama if it's running locally
  3. You'll see "Connected - X models found"
  4. Use Ollama models in any LLM node

Recommended Local Models

| Model | Size | Best For |
| --- | --- | --- |
| llama3.2 | 3B | Fast, general use |
| llama3.1 | 8B | Better quality |
| codestral | 22B | Coding tasks |
| mixtral | 47B | Complex reasoning |

Step 4: Use Models in Different Contexts

Workflow Editor

The tier buttons on each node let you quickly switch models without reconfiguring the node.

Editor AI Assistant

The AI chat in the editor uses your tier settings:

  1. Click the AI Assistant icon (top right)
  2. Select a tier for your conversation
  3. Ask for help building workflows

Agent Conversations

The Agent also uses your model preferences, so the tier you configure applies to agent conversations as well.


Cost Optimization Tips

  1. Start with FlowDot for development and testing
  2. Use Simple for straightforward transformations
  3. Reserve Complex for tasks that truly need it
  4. Mix tiers in workflows - use Simple for preprocessing, Complex for the main task
  5. Check OpenRouter for cost comparisons across providers
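To make tip 4 concrete, here's back-of-the-envelope arithmetic for mixing tiers. The per-token prices are placeholders, not real provider pricing; check your provider's price sheet before relying on the numbers:

```python
# Placeholder prices in USD per 1M tokens (illustrative, not real pricing).
PRICE_PER_1M = {"simple": 0.15, "capable": 3.00, "complex": 15.00}

def run_cost(tokens_by_tier: dict[str, int]) -> float:
    """Total cost of a workflow run given token counts per tier."""
    return sum(PRICE_PER_1M[t] * n / 1_000_000 for t, n in tokens_by_tier.items())

# All 100k tokens on Complex vs. 80k preprocessed on Simple first:
all_complex = run_cost({"complex": 100_000})
mixed = run_cost({"simple": 80_000, "complex": 20_000})
print(f"{all_complex:.2f} vs {mixed:.2f}")  # → 1.50 vs 0.31
```

Even with made-up prices, the shape of the saving holds: cheap preprocessing shrinks what the expensive model has to read.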

Troubleshooting

"API Key Invalid"

"Model Not Found"

Ollama Not Detected


Summary

FlowDot's flexible provider system lets you:

  • Connect your own API keys to 14+ providers
  • Set preferred models for the Simple, Capable, and Complex tiers
  • Run local models privately and offline through Ollama

Next, learn about the AI Editor Assistant to build workflows with natural language!
