
AI Providers

An overview of all supported AI providers, their models, and capabilities.


CoPilot supports multiple AI providers out of the box. Each provider has different strengths, pricing, and model options. You can switch between providers at any time in the chat.

| Provider | Models | Web Search | Best For |
| --- | --- | --- | --- |
| OpenAI | 6 | Yes | Cost-performance balance |
| Anthropic | 3 | Yes | Highest quality output |
| Google Gemini | 5 | No | Budget-friendly usage |

OpenAI

OpenAI is the most widely used provider and offers a good balance between quality, speed, and cost. To get started, create an account at platform.openai.com and generate an API key.
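If you follow the usual Craft convention of keeping secrets out of version control, the key would typically go in your project's `.env` file. The variable name below is illustrative, not necessarily the one CoPilot reads — check the plugin's settings for the exact name it expects:

```shell
# .env — illustrative variable name, not necessarily what CoPilot expects
OPENAI_API_KEY="sk-proj-your-key-here"
```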

Models
gpt-5.4 · gpt-5.4-mini · gpt-5.4-nano · gpt-4o · o3 · o4-mini
Web Search
Yes
Strengths
Fast responses, reliable tool calling, good all-rounder for content tasks
Weaknesses
Can be less creative than Anthropic for nuanced writing

Anthropic

Anthropic builds Claude, known for high-quality, nuanced text generation. Create an account at console.anthropic.com and generate an API key.

Models
claude-opus-4-7 · claude-opus-4-6 · claude-sonnet-4-6
Web Search
Yes
Strengths
Best writing quality, strong at following complex instructions, excellent for brand voice tasks
Weaknesses
Slower response times and higher cost per token compared to OpenAI and Gemini

Google Gemini

Google Gemini is the most affordable option with generous free tiers. Create an account at aistudio.google.com and generate an API key.

Models
gemini-3.1-pro-preview · gemini-3-flash-preview · gemini-3.1-flash-lite-preview · gemini-2.5-pro · gemini-2.5-flash
Web Search
No
Strengths
Lowest cost, fast response times, good for simple content tasks
Weaknesses
Less reliable for complex tool calling, no web search support

Langdock

Langdock is a GDPR-compliant (DSGVO) AI platform that routes requests to OpenAI, Anthropic, and Google Gemini through a single unified API. Installing the Langdock Provider Plugin routes all CoPilot AI requests through Langdock instead of directly to the providers. This gives you a single API key, EU or US data residency, and GDPR-compliant usage of LLMs.

Installation
```shell
ddev composer require samuelreichor/craft-co-pilot-landock
ddev craft plugin/install co-pilot-landock
```
Models
All models from OpenAI, Anthropic, and Google Gemini available in your Langdock workspace
Data Residency
EU or US
Strengths
Single API key for all providers, GDPR-compliant, centralized model management

Benchmark

We ran 9 real-world content scenarios against each provider, including field editing, nested matrix fields, translations, entry creation, batch operations, multi-site handling, propagation, and search. Here are the results:

Premium Models

| Provider / Model | Avg. Score | Avg. Duration |
| --- | --- | --- |
| OpenAI (gpt-5.4) | 100% | 27.4s |
| Anthropic (claude-opus-4-6) | 100% | 52.5s |
| Gemini (gemini-3.1-pro-preview) | 96.3% | 58.0s |

Budget Models

| Provider / Model | Avg. Score | Avg. Duration |
| --- | --- | --- |
| OpenAI (o4-mini) | 87.8% | 149.8s |
| Anthropic (claude-sonnet-4-6) | 95.6% | 40.8s |
| Gemini (gemini-2.5-flash) | 91.9% | 23.3s |

Note

Anthropic uses significantly fewer tokens per request due to prompt caching, which makes it very cost-efficient despite higher per-token pricing. Gemini's budget models offer the fastest response times. Scores reflect correctness of the final result across all scenarios.

Custom Providers

You can add your own AI provider by implementing the ProviderInterface. Head over to the Custom Providers guide for a full walkthrough.
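The exact interface contract is documented in the Custom Providers guide; conceptually, a provider is a class that names itself, lists its models, and forwards chat requests to an API. A rough PHP sketch of that shape (the method names and signatures here are hypothetical, not the real `ProviderInterface`):

```php
<?php

// Hypothetical shape of a custom provider — the real ProviderInterface
// methods are defined in the Custom Providers guide.
class MyProvider implements ProviderInterface
{
    // Human-readable name shown in the provider selection.
    public function getName(): string
    {
        return 'My Provider';
    }

    // Model identifiers this provider exposes to the chat.
    public function getModels(): array
    {
        return ['my-model-small', 'my-model-large'];
    }

    // Forward a chat request to the provider's API and return the reply.
    public function sendChat(array $messages, string $model): string
    {
        // Call your provider's HTTP API here (e.g. with Guzzle)
        // and return the assistant's response text.
        return '...';
    }
}
```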


Copyright © 2026 Samuel Reichör