A lightweight, cross-platform AI chat CLI built with Go. Supports multiple providers, streaming responses, file attachments, and an interactive terminal UI.
- Multi-provider — OpenAI, OpenAI Responses API, Anthropic, Gemini, Vertex AI, and OpenClaw, with custom base URL support
- MCP tool support — connect external MCP tool servers (filesystem, GitHub, databases, etc.) and let AI providers use them during chat
- Interactive model selection — arrow-key navigation with filtering
- Streaming responses — real-time token output with loading spinner
- Markdown highlighting — inline ANSI styling for headings, bold, italic, code, tables, and code blocks in streaming output
- File attachments — send images, PDFs, and text files alongside messages with Tab-completion for file paths
- Non-interactive mode — single message in, response out, pipe-friendly
- Conversation history — full context maintained within a session
- System prompt — set via flag or interactive input
- Config file — persistent API keys, default models, custom provider aliases, and MCP server definitions via `~/.chatchain.yaml`
- Styled terminal output — color-coded prompts
```sh
# Homebrew
brew tap joyqi/tap
brew install chatchain

# go install
go install github.com/joyqi/chatchain@latest

# Build from source
git clone https://github.com/joyqi/chatchain.git
cd chatchain
go build -o chatchain .
```

ChatChain provides a Claude Code plugin, allowing you to call other LLMs directly within Claude Code.
```
# Add the marketplace
/plugin marketplace add joyqi/chatchain

# Install the plugin
/plugin install chatchain@chatchain-marketplace
```

Slash command — manually ask another LLM:

```
/chatchain:ask openai gpt-4o "What is the meaning of life?"
/chatchain:ask anthropic claude-sonnet-4-20250514 "Explain monads"
/chatchain:ask gemini "Write a haiku"
```
Agent skill — Claude automatically uses ChatChain when you ask it to query another LLM:
> Use chatchain to ask GPT what is 1+1
> Ask Gemini to explain quicksort via chatchain
```sh
claude --plugin-dir ./chatchain-plugin
```

Usage:

```
chatchain [openai|anthropic|gemini|vertexai|openresponses|openclaw] [flags]
```

| Flag | Short | Description |
|---|---|---|
| `--key` | `-k` | API key (or set via env var) |
| `--url` | `-u` | Custom base URL |
| `--model` | `-M` | Model name (skip interactive selection) |
| `--temperature` | `-t` | Sampling temperature, 0.0-2.0 (omit to use provider default) |
| `--message` | `-m` | Send a single message and print the response (non-interactive; use `-` to read from stdin) |
| `--system` | `-s` | System prompt |
| `--system-input` | `-S` | Enter system prompt interactively |
| `--list` | `-l` | List configured providers, or models for a given provider |
| `--mcp` | | MCP server (command string or URL; repeatable) |
| `--config` | `-c` | Path to config file (default: `~/.chatchain.yaml`) |
| `--verbose` | `-v` | Print HTTP request/response bodies for debugging |
| Variable | Provider |
|---|---|
| `OPENAI_API_KEY` | OpenAI / OpenResponses |
| `ANTHROPIC_API_KEY` | Anthropic |
| `GOOGLE_API_KEY` | Gemini / Vertex AI |
| `OPENCLAW_GATEWAY_TOKEN` | OpenClaw |
ChatChain supports YAML config files for persistent settings and custom provider aliases.
Config files are discovered and merged in this order:

- `~/.chatchain.yaml` or `~/.chatchain.yml` (global)
- `./.chatchain.yaml` or `./.chatchain.yml` (project-local, merges over global)
- `-c/--config <path>` (explicit, highest priority, used alone)
Same-name providers in later files override earlier ones.
For individual values: CLI flag > env var > config file.
```yaml
# ~/.chatchain.yaml
providers:
  openai:
    key: sk-official
    model: gpt-4o
  deepseek:          # custom alias
    type: openai     # underlying provider type
    key: sk-deepseek-xxx
    url: https://api.deepseek.com/v1
    model: deepseek-chat
    system: "You are a helpful coding assistant"
  claude:
    type: anthropic
    key: sk-ant-xxx
    model: claude-sonnet-4-20250514

# MCP tool servers
mcp_servers:
  filesystem:
    command: npx
    args: ["-y", "@modelcontextprotocol/server-filesystem", "${workspaceFolder}"]
  github:
    url: https://mcp.example.com/sse
    headers:
      Authorization: "Bearer ${env:GITHUB_TOKEN}"
```

MCP server config values (`command`, `args`, `url`, `env`, `headers`) support VS Code-style variable expansion:
| Variable | Expands to |
|---|---|
| `${workspaceFolder}` / `${cwd}` | Current working directory |
| `${userHome}` | User home directory |
| `${pathSeparator}` / `${/}` | OS path separator (`/` or `\`) |
| `${env:VAR}` | Value of environment variable `VAR` |
Unknown variables are left untouched.
With this config:
```sh
# Use the "deepseek" alias — resolves to OpenAI provider with DeepSeek's key/URL/model
chatchain deepseek -m "hello"

# Config key used, no need for -k
chatchain openai -m "hi" -M gpt-4o

# CLI flag overrides config
chatchain openai -k sk-override -m "hi" -M gpt-4o
```

In interactive mode, the following commands are available:
| Command | Description |
|---|---|
| `/file <path>` | Attach a file (image, PDF, or text); supports Tab completion for file paths |
| `/files` | List all currently attached files |
| `/clear` | Remove all attached files |
| `/save [path]` | Save conversation history to a Markdown file (default: `history.md`) |
| `/import [path]` | Import conversation history from a saved Markdown file (default: `history.md`) |
| `/mcp` | Show connected MCP servers and their tools |
Attached files are sent with your next message, then cleared automatically.
| Type | Extensions |
|---|---|
| Images | .jpg, .jpeg, .png, .gif, .webp |
| Documents | .pdf |
| Text | .txt, .md, .go, .py, .js, .ts, .json, .yaml, .html, .css, .sql, .csv, and more |
```sh
# Interactive model selection
chatchain openai -k sk-xxx

# Specify model directly
chatchain openai -k sk-xxx -M gpt-4o

# Use Anthropic
chatchain anthropic -M claude-sonnet-4-20250514

# Use Gemini
chatchain gemini -M gemini-2.5-flash

# Use Vertex AI (with custom endpoint)
chatchain vertexai -u https://your-proxy.com/api/vertex-ai -M gemini-2.5-flash -m "Hello"

# Use OpenAI Responses API
chatchain openresponses -M gpt-4o -m "Hello"

# Use OpenClaw (connect to gateway, select agent)
chatchain openclaw -u ws://localhost:18789/ws -M main -m "Hello"

# With system prompt
chatchain openai -M gpt-4o -s 'You are a helpful translator' -m "Translate to French: hello"

# Interactive system prompt input (prompts System> after model selection)
chatchain openai -M gpt-4o -S

# Non-interactive mode (requires -M)
chatchain openai -M gpt-4o -m "Explain quicksort in one paragraph"

# Adjust temperature
chatchain anthropic -M claude-sonnet-4-20250514 -t 0.5 -m "Write a haiku"

# Custom API endpoint
chatchain openai -u https://your-proxy.com/v1 -k sk-xxx

# With MCP tools (ad-hoc server via CLI flag)
chatchain openai -M gpt-4o --mcp "npx -y @modelcontextprotocol/server-filesystem /tmp"

# Multiple MCP servers
chatchain anthropic -M claude-sonnet-4-20250514 --mcp "npx -y @modelcontextprotocol/server-filesystem /tmp" --mcp "https://mcp.example.com/sse"

# MCP servers from config file are loaded automatically
chatchain openai -M gpt-4o

# Read message from stdin (pipe-friendly)
echo "Explain quicksort" | chatchain openai -M gpt-4o -m -
cat prompt.txt | chatchain openai -M gpt-4o -m -

# Use a provider alias from config
chatchain deepseek -m "Explain quicksort"

# List all configured providers and aliases
chatchain -l

# List available models for a provider
chatchain -l openai
chatchain -l deepseek
```

```
You> /file photo.png
Attached: photo.png (image/png, 245760 bytes)
You> /file report.pdf
Attached: report.pdf (application/pdf, 102400 bytes)
You> /files
  [1] photo.png (image/png, 240.0 KB)
  [2] report.pdf (application/pdf, 100.0 KB)
You> Summarize the report and describe the photo
Assistant> ...
```
```
chatchain/
├── main.go               # Entry point
├── cmd/
│   └── root.go           # CLI definition (cobra)
├── config/
│   └── config.go         # Config file loading and merging
├── chat/
│   ├── chat.go           # Chat loop, model selection, tool-call loop, spinner
│   ├── file.go           # File attachment reading and MIME detection
│   ├── markdown.go       # Streaming markdown highlighter (ANSI)
│   └── styles.go         # Terminal style definitions
├── mcp/
│   └── manager.go        # MCP client manager (stdio + HTTP transports)
└── provider/
    ├── provider.go       # Provider interface, ToolProvider, Message types
    ├── openai.go         # OpenAI Chat Completions (+ tool calling)
    ├── openresponses.go  # OpenAI Responses API (+ tool calling)
    ├── anthropic.go      # Anthropic (+ tool calling)
    ├── gemini.go         # Gemini (+ tool calling)
    ├── vertexai.go       # Vertex AI (+ tool calling)
    └── openclaw.go       # OpenClaw Gateway
```
- cobra — CLI framework
- fatih/color — Terminal styling
- promptui — Interactive prompts
- readline — Line editing with tab completion (CJK-aware)
- spinner — Loading spinners
- openai-go — OpenAI SDK
- anthropic-sdk-go — Anthropic SDK
- go-genai — Google Gemini SDK
- openclaw-go — OpenClaw Gateway SDK
- go-sdk (MCP) — Model Context Protocol SDK
MIT