Model Providers

Imposter is designed to be provider-agnostic. Whether you want the total privacy of local models or the raw power of cloud-scale LLMs, connecting them is a simple, one-time process.

BYOK Architecture · Zero Data Logging · Local-First Storage
01 · Ollama (Local AI)

Running models locally is the most private way to use Imposter: your data never leaves your machine, and responses can be near-instant, depending on your hardware.

How to Set Up

  1. Install Ollama from ollama.com.
  2. Run ollama pull llama3 in your terminal.
  3. Imposter will auto-detect all installed models on startup.
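Under the hood, auto-detection can be done entirely through Ollama's local HTTP API: a GET to /api/tags on the default port returns every installed model. Here is a minimal sketch of that check (the function names are illustrative, not Imposter's actual source):

```python
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"  # Ollama's default base URL


def parse_tags(payload: dict) -> list[str]:
    """Extract model names from an /api/tags response body."""
    return [m["name"] for m in payload.get("models", [])]


def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Ask a running Ollama instance which models are installed."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return parse_tags(json.load(resp))


# With Ollama running, this prints something like ['llama3:latest', ...]:
# print(list_local_models())
```

If this call fails with a connection error, Ollama itself is not running, which is also the first thing to check in the Common Issues section below.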

Pro Tip: If you want to add a model manually (e.g., using a custom base URL), use the Model Management panel in Settings.

[Screenshot: the "Add New Intelligence" form for Ollama (Local): a name field (e.g. My Personal Llama), model ID (llama3), base URL (http://127.0.0.1:11434), and an "Add to Fleet" button.]
02 · OpenRouter (Cloud Hub)

[Screenshot: the "Add New Intelligence" form for OpenRouter (Cloud): a name field (Qwen3.6 Plus), model ID (qwen/qwen3.6-plus:free), a masked API key field, and an "Add to Fleet" button.]

OpenRouter is the standard for accessing hundreds of cloud models through a single API key. It's perfect if you don't want to manage local hardware.

How to Set Up

  1. Go to openrouter.ai and create an account.
  2. Generate an API Key in your dashboard.
  3. Search for a "Free" model and copy its Model ID.
  4. Paste the ID and Key into Imposter — everything is saved locally.
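OpenRouter exposes an OpenAI-compatible chat completions endpoint, so every request boils down to the same three ingredients you enter above: key, model ID, and prompt. A rough sketch of such a call (the helper names are illustrative, not Imposter's internals):

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"


def build_request(api_key: str, model_id: str, prompt: str):
    """Assemble the URL, headers, and JSON body for one chat completion."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model_id,  # must match the ID shown on openrouter.ai exactly
        "messages": [{"role": "user", "content": prompt}],
    })
    return OPENROUTER_URL, headers, body


def ask(api_key: str, model_id: str, prompt: str) -> str:
    """Send the request and return the model's reply text."""
    url, headers, body = build_request(api_key, model_id, prompt)
    req = urllib.request.Request(url, data=body.encode(), headers=headers)
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

A typo in the model string is what produces the 404 described under Common Issues, since the endpoint looks the model up by that exact ID.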
03 · Google Gemini (Direct)

Direct integration with Google Gemini via Google AI Studio. This provides the most stable and feature-rich experience for Gemini models without intermediaries.

Verification Flow

  1. Get your API Key from Google AI Studio.
  2. Paste the key and click "Verify Models".
  3. Imposter will ping Google's API, verify your key status, and automatically list every model available for that key.
  4. Select your preferred model (e.g., Gemini 2.5 Flash) and save.
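The verification step maps onto a single public endpoint: the Gemini API's model-list call, which both validates the key (a 200 response means it works) and returns the models it can use. A minimal sketch, assuming only the public REST API (the function names are illustrative):

```python
import json
import urllib.request

GEMINI_BASE = "https://generativelanguage.googleapis.com/v1beta"


def models_url(api_key: str) -> str:
    """URL that lists every model the key can access (and validates the key)."""
    return f"{GEMINI_BASE}/models?key={api_key}"


def parse_models(payload: dict) -> list[str]:
    """Pull short model names out of a list-models response."""
    return [m["name"].removeprefix("models/") for m in payload.get("models", [])]


def verify_key(api_key: str) -> list[str]:
    """Raises on a bad key; otherwise returns the usable model names."""
    with urllib.request.urlopen(models_url(api_key), timeout=10) as resp:
        return parse_models(json.load(resp))
```

A rejected key surfaces here as an HTTP error rather than a model list, which is why a stray space pasted into the key field makes verification fail.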
[Screenshot: the "Add New Intelligence" form for Google Gemini (Direct): a name field, a masked API key field, a "Verify Models" button with live progress ("Verifying 11/34..."), a model picker listing Gemini 2.5 Flash, Gemma 3 1B, and Gemma 3 4B, and an "Add to Fleet" button.]

Common Issues

Ollama models not showing up?

Ensure the Ollama application is running in your system tray. Click 'Refresh Models' in Imposter Settings.

Gemini verification failed?

Double-check your API key for extra spaces, and ensure your region is supported by Google AI Studio.

OpenRouter model returns 404?

Ensure the Model ID is exactly as shown on OpenRouter (e.g. 'anthropic/claude-3-haiku').

CORS or Network errors?

Imposter handles CORS automatically via the Main process. Check your firewall settings if issues persist.