# Models
opencode uses the AI SDK and Models.dev to support 75+ LLM providers, and it also supports running local models.
## Providers
You can configure providers in your opencode config under the `provider` section.
### Defaults
Most popular providers are preloaded by default. If you've added the credentials for a provider through `opencode auth login`, they'll be available when you start opencode.
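For example, to add credentials for a provider before starting opencode:

```bash
# Authenticate with a provider; opencode stores the credentials
# and the provider's models become available on the next run
opencode auth login
```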
### Custom
You can add custom providers by specifying the npm package for the provider and the models you want to use.
{ "$schema": "https://opencode.ai/config.json", "provider": { "openrouter": { "npm": "@openrouter/ai-sdk-provider", "name": "OpenRouter", "options": {}, "models": { "anthropic/claude-3.5-sonnet": { "name": "Claude 3.5 Sonnet" } } } }}
### Local
To configure a local model, specify the npm package to use and the `baseURL`.
{ "$schema": "https://opencode.ai/config.json", "provider": { "ollama": { "npm": "@ai-sdk/openai-compatible", "options": { "baseURL": "http://localhost:11434/v1" }, "models": { "llama2": {} } } }}
## Select a model
If you have multiple models, you can select the one you want by typing:

```
/models
```
## Loading models
When opencode starts up, it checks for the following:
- The model list in the opencode config, set in `opencode.json`. The format here is `provider/model`:

  ```json
  {
    "$schema": "https://opencode.ai/config.json",
    "model": "anthropic/claude-sonnet-4-20250514"
  }
  ```

- The last used model.
- The first model using an internal priority.
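As a sketch of how these pieces fit together, here's a hypothetical config that combines the local provider from above with a pinned default model. This assumes the key under `provider` (here `ollama`) doubles as the provider ID in the `provider/model` string:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "ollama/llama2",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama2": {}
      }
    }
  }
}
```

With this in place, opencode should start with `ollama/llama2` rather than falling back to the last used model or the internal priority.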