feat: add tool prompt override support in .continuerc.json #9314
base: main
Conversation
No issues found across 7 files
📚 Documentation PR created: #9315. I've created a documentation PR that adds usage examples and explains the new tool prompt override feature. The docs include accordion components showing how to configure and use the overrides, and they follow the existing docs structure.
@sestinj Just to personalize this a bit... I've been trying to love [...], but every model is a little different, so making the tool prompts completely overridable is, to me, just a necessity. I tried to stick to your established patterns and touch as little as possible, and I think there is minimal drift. I hope you'll consider this a nice xmas gift!
Before & after for reference (with PR #9325 fix) (gpt-oss-20b)
RomneyDa left a comment:
@shanevcantwell the ability to override tool prompts would be a neat change. Currently, JSON configuration is deprecated and we are very hesitant to make changes that only work in JSON.
What if we put this under `chatOptions` in a model's YAML configuration? Or do you have thoughts on other ways we could make this work for most users?
```ts
/**
 * Configuration for overriding built-in tool prompts.
 * Allows customization of tool descriptions and behavior at the repo level.
 */
export interface ToolOverride {
  // ...
}
```
Ideally use `Partial<Pick<Tool, ...>>` or similar for this type.
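As a rough sketch of what that suggestion could look like (the `Tool` shape here is assumed for illustration, with field names borrowed from this PR's example config rather than Continue's actual type):

```ts
// Illustrative Tool shape only; field names are taken from the example
// .continuerc.json in this PR, not from Continue's real Tool interface.
interface Tool {
  name: string;
  description: string;
  systemMessageDescription?: { prefix?: string };
}

// Derive the override type from Tool so the two cannot drift apart,
// following the Partial<Pick<...>> suggestion. `disabled` is an
// override-only flag, so it is added alongside the picked fields.
type ToolOverride = Partial<
  Pick<Tool, "description" | "systemMessageDescription">
> & {
  name: string;
  disabled?: boolean;
};
```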
Cheers, thanks for having a look. I missed that detail about JSON deprecation. No problem at all. I'll put the JSON implementation on a separate branch on my fork in case there is any use for it, and bring in your chatOptions suggestion to replace the PR code. I'll get it back through CI right away.
Adds per-model tool prompt overrides under chatOptions in YAML config:
```yaml
models:
- name: my-model
chatOptions:
toolPromptOverrides:
run_terminal_command:
description: "Custom description"
view_diff:
disabled: true
```
This replaces the JSON-only implementation with YAML-only support as
requested by maintainers. The JSON implementation is preserved on the
`feature/tool-prompt-overrides-json` branch for reference.
Changes:
- Add toolOverrideSchema to chatOptionsSchema (packages/config-yaml)
- Add ToolOverride interface and toolPromptOverrides to LLMOptions (core)
- Store and apply overrides in BaseLLM.streamChat()
- Add applyToolOverrides utility with comprehensive tests
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
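As a rough illustration of the "store and apply overrides" step described in the commit above, here is a minimal sketch of what an `applyToolOverrides`-style helper could look like. The `ToolLike` shape, the `ToolPromptOverrides` map, and the signature are assumptions for illustration, not the PR's actual code:

```ts
// Override map keyed by tool name, mirroring the YAML example above
// (e.g. view_diff: { disabled: true }). Shape is assumed for illustration.
type ToolPromptOverrides = Record<
  string,
  { description?: string; disabled?: boolean }
>;

// Minimal stand-in for a tool definition as sent to the model.
interface ToolLike {
  function: { name: string; description?: string };
}

// Drop disabled tools and swap in overridden descriptions before the
// tool list is handed to the model.
function applyToolOverrides(
  tools: ToolLike[],
  overrides: ToolPromptOverrides | undefined,
): ToolLike[] {
  if (!overrides) {
    return tools;
  }
  return tools
    .filter((tool) => !overrides[tool.function.name]?.disabled)
    .map((tool) => {
      const override = overrides[tool.function.name];
      if (!override?.description) {
        return tool;
      }
      return {
        ...tool,
        function: { ...tool.function, description: override.description },
      };
    });
}
```

Per the commit message, the intent is that BaseLLM.streamChat() runs the tool list through such a helper before the request is sent to the model.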
Force-pushed from 3f26b0d to df3e417.


Summary
Adds the ability to customize tool prompts via `.continuerc.json` at the workspace level, plus fixes two bugs that were preventing this feature from working.
- `ToolOverride` interface for customizing tool descriptions, disabling tools, or modifying system message prompts
- Fix for workspace `.continuerc.json` files failing to load (Bug: .continuerc.json files fail to load due to 'this' binding issue #9312)
- Fix for the YAML config path not loading workspace `.continuerc.json` files (Bug: YAML config path does not load workspace .continuerc.json files #9313)
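For reference, the override shape implied by the example config further below might look roughly like this (field names are inferred from that example, not copied from the PR's actual interface):

```ts
// Inferred from the example .continuerc.json below; the real ToolOverride
// interface in the PR may differ.
interface ToolOverride {
  name: string;                 // which built-in tool to override
  disabled?: boolean;           // remove the tool entirely
  description?: string;         // replace the tool's description
  systemMessageDescription?: {  // adjust the tool's system-message prompt
    prefix?: string;
  };
}
```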
Motivation
Local models (via LM Studio, Ollama, etc.) often struggle with Continue's default tool prompts, which include legacy syntax examples that confuse models into outputting raw `[TOOL_CALLS]BEGIN_ARG...` text instead of using proper tool calling. This feature allows per-repo customization of tool prompts to work better with specific models.
Before (broken tool syntax output):
After (clean tool usage with custom prompts):
Example `.continuerc.json`
```json
{
  "mergeBehavior": "merge",
  "tools": [
    {
      "name": "view_diff",
      "disabled": true
    },
    {
      "name": "read_file",
      "description": "Read file contents. Use relative paths from workspace root.",
      "systemMessageDescription": {
        "prefix": "To read a file, use read_file with a relative path:"
      }
    }
  ]
}
```
Bug Fixes Included
Fix #9312: `.continuerc.json` files fail to load. `rcFiles.map(ide.readFile)` lost its `this` binding, causing silent failures. Fixed by using `rcFiles.map((uri) => ide.readFile(uri))`.

Fix #9313: YAML config path ignores `.continuerc.json`. `loadContinueConfigFromYaml` never called `getWorkspaceRcConfigs`. Added conditional loading in `doLoadConfig.ts` so tool overrides work regardless of config format.
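A tiny illustration of the `this`-binding pitfall behind Fix #9312 (the `Ide` class here is a stand-in, not Continue's actual IDE interface):

```ts
// Stand-in IDE with a method that relies on `this`.
class Ide {
  private scheme = "file://";
  async readFile(fileUri: string): Promise<string> {
    // When readFile is passed around unbound, `this` is undefined here,
    // so the returned promise rejects instead of resolving with content.
    return `${this.scheme}${fileUri}`;
  }
}

const ide = new Ide();
const rcFiles = [".continuerc.json", "sub/.continuerc.json"];

// Broken: the method reference is detached from `ide`, losing `this`;
// every resulting promise rejects, which surfaced as a silent failure.
const broken = rcFiles.map(ide.readFile);

// Fixed (the approach described above): the arrow function keeps the call
// on `ide`, so `this` stays bound correctly.
const fixed = rcFiles.map((uri) => ide.readFile(uri));
```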
Test plan
- `.continuerc.json` loads with JSON config
- `.continuerc.json` loads with YAML config
- `disabled: true` removes the tool from the list

Fixes #9312, Fixes #9313
🤖 Generated with Claude Code
Summary by cubic
Add per-model tool prompt overrides via YAML config to customize or disable tools, applied in BaseLLM during chat.
Written for commit df3e417. Summary will update on new commits.