opencode-lmstudio-sync is an OpenCode plugin that discovers the models exposed by a local LM Studio server and merges them into the OpenCode runtime configuration at startup.
It keeps user-authored model entries intact, fills in missing metadata when available, and avoids writing back to the user's config file by default.
- Discovers authoritative model IDs from LM Studio's `GET /v1/models`
- Optionally enriches model names and context limits from `GET /api/v1/models`
- Creates or completes the `lmstudio` provider at runtime when needed
- Preserves user-defined model names and context limits by default
- Supports optional prune and refresh flags for plugin-generated entries
- Uses short in-memory caching to avoid repeated network calls
- Works on Windows, macOS, and Linux
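The short in-memory caching mentioned above can be pictured as a small timestamped map. This is an illustrative sketch, not the plugin's actual internals; the `TtlCache` name and `ttlMs` parameter are hypothetical.

```typescript
// Minimal TTL cache sketch (illustrative; not the plugin's real implementation).
type CacheEntry<T> = { value: T; expiresAt: number };

class TtlCache<T> {
  private entries = new Map<string, CacheEntry<T>>();
  constructor(private ttlMs: number) {}

  get(key: string): T | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.entries.delete(key); // expired: drop and report a miss
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: T): void {
    this.entries.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}
```

A cache like this lets repeated config merges within a short window reuse one `/v1/models` response instead of re-fetching it.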
The plugin logic is cross-platform.
- Windows: supported. The probe order checks `127.0.0.1:15432` and `localhost:15432` first, which matches common LM Studio setups on Windows.
- macOS: supported. If OpenCode loads the plugin from a protected folder such as `Documents`, the app may need explicit file access in macOS privacy settings.
- Linux: supported. If OpenCode is installed through Flatpak or Snap, filesystem sandboxing may require additional access to the plugin directory.
Use this after publishing the package to npm.
```json
{
  "$schema": "https://opencode.ai/config.json",
  "plugin": ["opencode-lmstudio-sync@latest"]
}
```

Optional explicit LM Studio base URL:
```json
{
  "$schema": "https://opencode.ai/config.json",
  "plugin": ["opencode-lmstudio-sync@latest"],
  "provider": {
    "lmstudio": {
      "options": {
        "baseURL": "http://127.0.0.1:15432/v1"
      }
    }
  }
}
```

Full explicit provider:
```json
{
  "$schema": "https://opencode.ai/config.json",
  "plugin": ["opencode-lmstudio-sync@latest"],
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio (local)",
      "options": {
        "baseURL": "http://127.0.0.1:15432/v1"
      }
    }
  }
}
```

- Clone the repository.
- Install dependencies.
- Build the project.
- Add a small local loader file in your OpenCode plugin directory.
Build commands:
```bash
npm install
npm run build
```

Local loader example:
```bash
CONFIG_HOME="${XDG_CONFIG_HOME:-$HOME/.config}"
OPENCODE_DIR="$CONFIG_HOME/opencode"
PLUGIN_DIR="/absolute/path/to/opencode-lmstudio-sync"

mkdir -p "$OPENCODE_DIR/plugins"
cat > "$OPENCODE_DIR/plugins/opencode-lmstudio-sync.js" <<EOF2
export { LMStudioPlugin } from "$PLUGIN_DIR/dist/index.js";
EOF2
```

If your OpenCode installation uses a different config root, place the loader file inside that installation's plugin directory instead.
The plugin resolves the LM Studio base URL in this order:
1. `provider.lmstudio.options.baseURL`
2. The `LM_STUDIO_BASE_URL` environment variable
3. Probing these bases in order:
   - `http://127.0.0.1:15432`
   - `http://localhost:15432`
   - `http://127.0.0.1:1234`
   - `http://localhost:1234`
   - `http://127.0.0.1:8080`
   - `http://localhost:8080`
   - `http://127.0.0.1:11434`
   - `http://localhost:11434`
Accepted values are normalized to `<base>/v1`.
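The normalization step can be sketched as follows; the function name `normalizeBaseUrl` is hypothetical and the exact trimming logic is an assumption, not the plugin's source.

```typescript
// Sketch of base-URL normalization to `<base>/v1` (name and regex are illustrative).
function normalizeBaseUrl(raw: string): string {
  const base = raw.trim().replace(/\/+$/, ""); // drop trailing slashes
  return base.endsWith("/v1") ? base : `${base}/v1`; // append /v1 exactly once
}
```

Under this sketch, `http://127.0.0.1:15432`, `http://127.0.0.1:15432/`, and `http://127.0.0.1:15432/v1` all resolve to the same base.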
- `LM_STUDIO_BASE_URL`: use an explicit LM Studio server URL
- `LM_STUDIO_API_TOKEN`: optional bearer token for `/api/v1/models`
- `LMSTUDIO_PRUNE_MISSING=1`: remove plugin-generated models that are no longer returned by `/v1/models`
- `LMSTUDIO_REFRESH_EXISTING=1`: refresh metadata for existing plugin-generated entries
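Since the prune and refresh flags are documented as `=1`, a reasonable reading is that only the exact string `"1"` enables them. The helper below is an illustrative sketch under that assumption; `flagEnabled` is not a real export of the plugin.

```typescript
// Treat only the exact value "1" as enabled, matching the flag spellings above.
// Helper name and strictness are assumptions, not the plugin's actual behavior.
function flagEnabled(env: Record<string, string | undefined>, name: string): boolean {
  return env[name] === "1";
}
```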
- `GET /v1/models` is the source of truth for which models appear in OpenCode.
- `GET /api/v1/models` is optional enrichment only.
- User-defined `name` values are preserved.
- User-defined `limit.context` values are preserved unless refresh is explicitly enabled for plugin-generated entries.
- The plugin tracks generated models in memory instead of adding custom marker fields to the config.
- The plugin mutates the runtime config object and does not write to the user's config file.
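The preservation rules above amount to a merge that only fills gaps. The following is a simplified sketch; the `ModelEntry` type and `mergeModel` helper are hypothetical, and the real OpenCode model schema has more fields.

```typescript
// Hypothetical simplified model entry; the real OpenCode schema is richer.
interface ModelEntry {
  name?: string;
  limit?: { context?: number };
}

// Merge discovered metadata into a user entry without overwriting user values.
function mergeModel(user: ModelEntry, discovered: ModelEntry): ModelEntry {
  return {
    name: user.name ?? discovered.name, // user-defined name wins
    limit: {
      // user-defined context limit wins unless absent
      context: user.limit?.context ?? discovered.limit?.context,
    },
  };
}
```

In this sketch a user entry with only a custom `name` keeps that name while picking up a discovered context limit.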
```json
{
  "$schema": "https://opencode.ai/config.json",
  "plugin": ["opencode-lmstudio-sync@latest"],
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio (local)",
      "options": {
        "baseURL": "http://127.0.0.1:15432/v1"
      },
      "models": {
        "google/gemma-3n-e4b": {
          "name": "Gemma 3n-e4b",
          "organizationOwner": "google",
          "limit": {
            "context": 8192
          }
        }
      }
    }
  }
}
```

```bash
npm install
npm run build
npm run typecheck
npm test
```

- LM Studio is not detected: make sure the local server is running and exposing the OpenAI-compatible API.
- Models do not appear: verify that `GET /v1/models` returns the expected IDs.
- Metadata is missing: the plugin still works without `/api/v1/models`, but display names and context limits may be less detailed.
- macOS cannot load the plugin from `Documents`: give OpenCode access to that folder or move the project to a less restricted path.
- Linux sandboxed app cannot read the plugin directory: grant filesystem access or move the project to a path visible to the app.
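To check what `GET /v1/models` actually returns, a short script can list the model IDs. `extractModelIds` is a hypothetical helper, and the response shape assumed here is the standard OpenAI-compatible `{ "data": [{ "id": ... }] }` envelope.

```typescript
// Pull model IDs out of an OpenAI-compatible /v1/models response body.
function extractModelIds(body: { data?: Array<{ id: string }> }): string[] {
  return (body.data ?? []).map((m) => m.id);
}

// Example usage against a running LM Studio server (adjust the port to yours):
// const res = await fetch("http://127.0.0.1:15432/v1/models");
// console.log(extractModelIds(await res.json()));
```

If the IDs printed here do not match what you expect, the issue is on the LM Studio side rather than in the plugin's merge logic.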
MIT