Agent-first job runner for API workflows.
Write a job. Brief your agent. Let it dispatch.
```sh
npm install -g dispatchkit
```

Dispatch is a modular CLI for running API workflows — built from the ground up to be operated by AI agents, and comfortable for the humans who write the jobs.
Most CLIs are built for humans and tolerated by agents. Dispatch inverts this. The primary operator is an AI agent. The human experience is layered on top.
The flow:

```
developer (natural language)
  → agent reads SKILL.md
  → agent composes job
  → dispatch job validate
  → dispatch job run      ← real API calls happen here
  → dispatch job assert
  → agent reports back
```
| Action | Description |
|---|---|
| `flow.sleep` | Pause execution for a deterministic duration |
| `flow.poll` | Call another action repeatedly until conditions match or timeout |
| Action | Description |
|---|---|
| `memory.store` | Store a value by key in a namespace |
| `memory.recall` | Recall a value by key from a namespace |
| `memory.forget` | Forget one key or clear one namespace |
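For illustration, a case-job step that recalls a stored value might look like this. This is a sketch: the `namespace`/`key` payload shape is inferred from the `memory.store` seed example later in this README.

```json
{
  "id": "load-user",
  "action": "memory.recall",
  "payload": {
    "namespace": "reference-data",
    "key": "users.user-1"
  }
}
```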
```sh
# Install
npm install -g dispatchkit

# Health check
dispatch self-check
dispatch doctor

# Optional: repo-local env loading with direnv
cp .envrc.example .envrc
direnv allow

# Run the built-in flow example
dispatch job validate --case jobs/flow-sleep.job.case.json
dispatch job run --case jobs/flow-sleep.job.case.json
dispatch job assert --run-id latest

# Try the repo jsonplaceholder module
dispatch module inspect jsonplaceholder
dispatch job validate --case modules/jsonplaceholder/jobs/jsonplaceholder-kitchen-sink.job.case.json
dispatch job run --case modules/jsonplaceholder/jobs/jsonplaceholder-kitchen-sink.job.case.json
dispatch job assert --run-id latest
```

For local development, the recommended workflow is to keep environment-specific values outside the job file and load them automatically when you enter the repo.
- Copy `.envrc.example` to `.envrc`
- Replace placeholder values with your local values
- Run `direnv allow`

```sh
cp .envrc.example .envrc
direnv allow
```

This gives you one repo-local environment bundle for:
- secret values such as usernames and passwords
- non-secret values such as base URLs and shared headers
The job file stays portable and explicit. The local environment stays out of git.
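A minimal `.envrc` sketch, using the env var names that appear later in this README. The values shown are placeholders, not real defaults:

```sh
# .envrc — repo-local environment, loaded by direnv (placeholder values)
export DISPATCH_HTTP_BASE_URL="https://api.example.com"
export DISPATCH_HTTP_X_CONTEXT="local-dev"
export DISPATCH_ADMIN_USERNAME="admin"
export DISPATCH_ADMIN_PASSWORD="change-me"
```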
Release notes for maintainers live in `docs/release.md`. Agent-native product principles live in `docs/agent-native.md`. HTTP auth/session behavior for module authors lives in `docs/modules/http-auth.md`. Packaged module-author guidance lives in `MODULE_AUTHORING.md` and `CONVENTIONS.md`.
A job case is a portable JSON file — shareable, versionable, replayable.
```json
{
  "schemaVersion": 1,
  "jobType": "my-workflow",
  "scenario": {
    "steps": [
      {
        "id": "pause",
        "action": "flow.sleep",
        "payload": { "duration": "1s" }
      },
      {
        "id": "wait-for-ready",
        "action": "flow.poll",
        "payload": {
          "action": "probe.get-status",
          "payload": {
            "id": "${run.targetId}"
          },
          "intervalMs": 1000,
          "maxDurationMs": 10000,
          "conditions": {
            "mode": "ALL",
            "rules": [{ "path": "$.ready", "op": "eq", "value": true }]
          },
          "store": {
            "resourceId": "$.id"
          }
        }
      }
    ]
  }
}
```

- Actions are always namespaced: `module.action`
- Payloads use intent fields, not raw wire payloads
- Interpolation uses `${env.NAME}`, `${step.<id>.response.<field>}`, `${step.<id>.exports.<field>}`, or `${jsonpath(step:<id>, <path>)}`
- Same-run values should flow through `step.*` or `run.*`, not persistent memory
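To make the placeholder forms concrete, here is a minimal sketch of how `${env.*}` and `${step.*}` lookups could resolve against run state. This is illustrative only, not Dispatch's actual resolver, and it omits the `${jsonpath(...)}` form entirely:

```js
// Sketch: resolve ${...} placeholders against an object of run state.
// Illustrative only; not Dispatch's actual interpolation engine.
function interpolate(template, state) {
  return template.replace(/\$\{([^}]+)\}/g, (_, expr) => {
    // Walk the dotted path (e.g. "step.login.response.token") into state.
    const value = expr.split('.').reduce(
      (obj, key) => (obj == null ? undefined : obj[key]),
      state,
    );
    if (value === undefined) throw new Error(`unresolved placeholder: ${expr}`);
    return String(value);
  });
}

const state = {
  env: { NAME: 'qa' },
  step: { login: { response: { token: 'abc123' } } },
};

interpolate('${env.NAME}', state);                         // 'qa'
interpolate('Bearer ${step.login.response.token}', state); // 'Bearer abc123'
```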
`env.*` is the bridge between repo-local setup and portable jobs. With direnv, the values come from `.envrc`; in CI they can come from normal environment injection.
If an action generates a same-run workflow value that should not be faked into the transport response, return it under `exports` and reference it from later steps:

```js
return {
  response: { ok: true },
  exports: { generatedId },
};
```

```json
{
  "id": "consume",
  "action": "example.consume",
  "payload": {
    "generatedId": "${step.publish.exports.generatedId}"
  }
}
```

If the job wants a stable workflow-level name instead of coupling later steps to the action’s export key, capture it explicitly into `run.*`:
```json
{
  "id": "publish",
  "action": "example.publish",
  "capture": {
    "workflowId": "exports.generatedId"
  }
}
```

Later steps can then use `${run.workflowId}`.
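The capture mechanics can be sketched generically. This illustrates the mapping idea only; `applyCapture` is a hypothetical helper, not Dispatch's internals:

```js
// Sketch: project a step's result into run-scoped values via a capture map.
// Hypothetical helper for illustration; Dispatch's real logic may differ.
function applyCapture(captureMap, stepResult, run = {}) {
  for (const [runKey, sourcePath] of Object.entries(captureMap)) {
    // Walk a dotted path like "exports.generatedId" into the step result.
    run[runKey] = sourcePath.split('.').reduce(
      (obj, key) => (obj == null ? undefined : obj[key]),
      stepResult,
    );
  }
  return run;
}

const stepResult = { response: { ok: true }, exports: { generatedId: 'wf-42' } };
const run = applyCapture({ workflowId: 'exports.generatedId' }, stepResult);
// run.workflowId === 'wf-42'
```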
Jobs can declare shared request context once at the top level:
```json
{
  "http": {
    "baseUrl": "https://api.example.com",
    "defaultHeaders": {
      "x-client": "dispatch"
    }
  }
}
```

- `http.baseUrl` configures the root `ctx.http` transport for the whole run
- `http.defaultHeaders` applies to all requests in the run unless a handler narrows or overrides them
- Cookies/session continuity still live in the shared run transport
- Handlers can still derive narrower clients with `ctx.http.withDefaults(...)`
If those values differ by developer machine, environment, or CI target, keep the job explicit and resolve the actual values from `${env.*}`:
```json
{
  "http": {
    "baseUrl": "${env.DISPATCH_HTTP_BASE_URL}",
    "defaultHeaders": {
      "x-client": "dispatch",
      "x-context": "${env.DISPATCH_HTTP_X_CONTEXT}"
    }
  }
}
```

This is the recommended way to connect direnv to shared request config.
If the env-backed values are missing or resolve to invalid HTTP config:

- `dispatch job validate` fails before execution
- `dispatch job run` also fails before execution
If a job intentionally relies on shared HTTP context, declare that explicitly in dependencies:
```json
{
  "http": {
    "baseUrl": "https://api.example.com",
    "defaultHeaders": {
      "x-client": "dispatch"
    }
  },
  "dependencies": {
    "http": {
      "required": ["baseUrl", "defaultHeaders.x-client"]
    }
  }
}
```

- `job.http` holds the actual values
- `dependencies.http.required` declares which paths must be present
- `dispatch job validate` and `dispatch job run` fail early if required HTTP config is missing
Jobs can bind named credential profiles without putting plaintext secrets in the case file:
```json
{
  "credentials": {
    "adminQa": {
      "fromEnv": {
        "username": "DISPATCH_ADMIN_USERNAME",
        "password": "DISPATCH_ADMIN_PASSWORD"
      }
    }
  },
  "scenario": {
    "steps": [
      {
        "id": "login",
        "action": "admin.login",
        "credential": "adminQa",
        "payload": {}
      }
    ]
  }
}
```

- `credentials.<name>.fromEnv` maps credential field names to environment variables
- step `credential` binds one named profile to an action
- actions read resolved secrets from `ctx.credential`, not from payload
- `dispatch job validate` fails if a step is missing a required credential binding
- `dispatch job run` also fails early if required environment variables are missing
This pairs naturally with direnv: the job stores only env var names, while `.envrc` or CI provides the actual secret values.
If an action expects a credential contract, declare it with `credentialSchema` so `module inspect` and `schema action --print` can surface it.
The intended setup is:

- `direnv` loads environment-specific values into your shell
- the job reads non-secret values through `${env.*}`
- the job binds secrets through `credentials.<name>.fromEnv`
- steps still bind credentials explicitly with `credential`
Example:

```json
{
  "http": {
    "baseUrl": "${env.DISPATCH_HTTP_BASE_URL}",
    "defaultHeaders": {
      "x-context": "${env.DISPATCH_HTTP_X_CONTEXT}"
    }
  },
  "credentials": {
    "admin": {
      "fromEnv": {
        "username": "DISPATCH_ADMIN_USERNAME",
        "password": "DISPATCH_ADMIN_PASSWORD"
      }
    }
  },
  "scenario": {
    "steps": [
      {
        "id": "login",
        "action": "admin.login",
        "credential": "admin",
        "payload": {}
      }
    ]
  }
}
```

That keeps the job inspectable and portable while removing the need for repeated manual `export ...` commands.
Dispatch distinguishes between portable case jobs and memory-mutating seed jobs:

`*.job.case.json`

- may use `memory.recall`
- may not use `memory.store`
- may not use `memory.forget`

`*.job.seed.json`

- may use all memory actions
Memory is for durable cross-run state, not same-run wiring. Persistent values live at `~/.dispatch/memory/<namespace>.json`.
Keys are dotted paths that address locations inside the namespace file. The file on disk is plain nested JSON:
```json
{
  "users": {
    "user-1": { "id": 1, "name": "Leanne Graham" },
    "user-2": { "id": 2, "name": "Ervin Howell" }
  }
}
```

A seed job step that writes `user-1`:
```json
{
  "id": "store-user",
  "action": "memory.store",
  "payload": {
    "namespace": "reference-data",
    "key": "users.user-1",
    "value": { "id": 1, "name": "Leanne Graham" }
  }
}
```

Jobs can declare explicit prerequisites:
```json
{
  "dependencies": {
    "modules": [{ "name": "jsonplaceholder", "version": "^0.2.0" }],
    "memory": [
      {
        "namespace": "jsonplaceholder-reference",
        "key": "users.user-1",
        "fill": {
          "module": "jsonplaceholder",
          "job": "seed-user-1-reference"
        }
      }
    ],
    "http": {
      "required": ["baseUrl", "defaultHeaders.x-client"]
    }
  }
}
```

- Module dependencies are validated before execution
- Missing memory dependencies fail early with actionable `next[]`
- Missing required HTTP config fails before actions execute
- `dispatch job run --resolve-deps` can run fill jobs before the main job
- Fill jobs resolve by logical module job id, preferring `<job>.job.seed.json` over `<job>.job.case.json`
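The seed-over-case preference can be sketched as a tiny resolver. This is an illustration of the rule only; `resolveFillJob` is a hypothetical helper, not Dispatch's actual resolution code:

```js
// Sketch: prefer <job>.job.seed.json over <job>.job.case.json for fill jobs.
// Hypothetical helper for illustration.
function resolveFillJob(shippedFiles, jobId) {
  const seed = `${jobId}.job.seed.json`;
  const caseFile = `${jobId}.job.case.json`;
  if (shippedFiles.includes(seed)) return seed;
  if (shippedFiles.includes(caseFile)) return caseFile;
  return null; // nothing shipped under that logical job id
}

resolveFillJob(['a.job.case.json', 'a.job.seed.json'], 'a'); // 'a.job.seed.json'
resolveFillJob(['a.job.case.json'], 'a');                    // 'a.job.case.json'
```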
Every command supports dual output mode:
```sh
dispatch job run --case my.job.case.json         # human mode
dispatch job run --case my.job.case.json --json  # machine mode
```

Machine mode (`--json`) returns stable JSON envelopes:
```json
{
  "cliVersion": "0.0.1",
  "jobType": "my-workflow",
  "status": "SUCCESS",
  "runId": "20260306-162116-job-run-716a47b",
  "runDir": "~/dispatch/run-output/20260306-162116-job-run-716a47b",
  "moduleResolutionPath": "~/dispatch/run-output/20260306-162116-job-run-716a47b/module_resolution.json",
  "next": [
    {
      "command": "dispatch job assert --run-id 20260306-162116-job-run-716a47b",
      "description": "verify outcomes"
    }
  ]
}
```

The `next` field tells the agent what to do after every command. No reasoning required.
Exit codes:

| Code | Meaning |
|---|---|
| 0 | success |
| 1 | internal error |
| 2 | usage/input error — do not retry |
| 3 | transient — retry safe |
| 4 | not found |
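An agent wrapper might branch on these codes; a minimal sketch, with decision labels made up for illustration:

```js
// Sketch: map Dispatch exit codes to an agent-side decision.
// The returned labels are illustrative, not part of Dispatch.
function retryPolicy(exitCode) {
  switch (exitCode) {
    case 0: return 'done';       // success
    case 2: return 'fix-input';  // usage/input error: do not retry
    case 3: return 'retry';      // transient: retry safe
    case 4: return 'not-found';
    default: return 'report';    // 1 (internal error) or anything unknown
  }
}

retryPolicy(3); // 'retry'
retryPolicy(2); // 'fix-input'
```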
Every run writes a full deterministic record:
```
run-output/<runId>/
  summary.json           — status, timing, key outputs
  activity.log           — step-by-step timeline
  job.case.input.json    — original job case
  job.case.resolved.json — resolved with interpolation applied
  meta.json              — run metadata
  module_resolution.json — which module handled each action
```
Written on both success and failure. Assert offline. Replay without network.
```sh
dispatch job validate --case <path>
dispatch job run --case <path> [--resolve-deps]
dispatch job run-many --case <path> --count <n> --concurrency <n>
dispatch job assert --run-id <id|latest>
dispatch job inspect --run-id <id|latest> [--step <n>]
dispatch job readable --run-id <id|latest>
dispatch job dump --run-id <id|latest> [--out <path>]
dispatch job replay --run-id <id>
dispatch job list [--limit <n>]
dispatch job latest
dispatch job cases
dispatch job export --run-id <id> --out <path>
dispatch job import --file <path>
dispatch job batch-inspect --batch-id <id|latest>
```

```sh
dispatch module list
dispatch module inspect <name>
dispatch module validate --path <dir>
dispatch module init --name <name> --out <dir>
dispatch module pack --path <dir> --out <bundle.dpmod.zip>
dispatch module install --bundle <bundle.dpmod.zip>
dispatch module uninstall --name <module>
dispatch module override init --from <module.action> --out <dir>
dispatch module override add --module <module> --action <action> [--path <dir>]
```

```sh
dispatch runtime show
dispatch runtime unset [--all]
dispatch defaults show [--action <module.action>]
dispatch defaults set --action <module.action> --file <path>
dispatch defaults unset --action <module.action>
dispatch memory list
dispatch memory inspect --namespace <name>
```

```sh
dispatch doctor
dispatch self-check
dispatch schema case --print
dispatch schema action --name <module.action> --print
dispatch skill-version
dispatch completion <bash|zsh|fish>
```

Dispatch modules wrap API surfaces. Three layers, last wins:
```
builtin  src/modules/builtin/*  ships with dispatch
repo     ./modules/*            project-local
user     ~/.dispatch/modules/*  user-installed bundles
```

```sh
dispatch module init --name payments --out ./modules/payments
```

Generates two files:

```
payments/
  module.json — discovery manifest
  index.mjs   — runtime module entry
```
```json
{
  "name": "payments",
  "version": "0.1.0",
  "entry": "index.mjs"
}
```

```js
import { z } from 'zod';
import { defineAction, defineModule } from 'dispatchkit';

async function registerWebhook(ctx, payload) {
  return {
    response: {
      ok: true,
      received: payload,
    },
    detail: 'replace with real implementation',
  };
}

export default defineModule({
  name: 'payments',
  version: '0.1.0',
  actions: {
    'register-webhook': defineAction({
      description: 'Register a webhook endpoint',
      schema: z.object({
        url: z.string().url(),
      }),
      handler: registerWebhook,
    }),
  },
});
```

- `module.json` is discovery metadata and runtime entry location
- `index.mjs` default-exports the full module object
- external modules may be authored in TS and compiled to JS; `module.json.entry` should point at the built runtime file
Modules can also ship job files under `jobs/`:

```
payments/
  module.json
  index.mjs
  jobs/
    sync-catalog.job.case.json
    cache-reference-data.job.seed.json
```

- `dispatch module inspect <name> --json` lists discovered shipped jobs
- `dispatch schema action --name <module.action> --print` includes input, declared exports, and declared credential schema when present
- `dispatch module validate --path <dir>` validates both handlers and shipped job files
- Use seed jobs for cache/bootstrap flows that populate memory for later case jobs
The repository ships a public example module under `modules/jsonplaceholder`. Its shipped jobs demonstrate the intended pattern: each job declares `http.baseUrl`, and the module actions use relative HTTP paths through the shared run transport.
Useful commands:

```sh
dispatch module inspect jsonplaceholder
dispatch module validate --path modules/jsonplaceholder
dispatch job validate --case modules/jsonplaceholder/jobs/jsonplaceholder-kitchen-sink.job.case.json
dispatch job run --case modules/jsonplaceholder/jobs/jsonplaceholder-kitchen-sink.job.case.json
dispatch job run-many --case modules/jsonplaceholder/jobs/jsonplaceholder-run-many.job.case.json --count 3
```

Example shipped jobs:

- `jsonplaceholder-kitchen-sink.job.case.json`: exercises `flow.sleep`, `flow.poll`, interpolation, `run.*`, and a follow-up create call.
- `jsonplaceholder-relations.job.case.json`: traverses user -> albums -> photos and user -> posts -> comments.
- `jsonplaceholder-poll.job.case.json`: focused `flow.poll` example using `jsonplaceholder.get-post`.
- `seed-user-1-reference.job.seed.json`: populates durable memory under `jsonplaceholder-reference.users.user-1`.
- `jsonplaceholder-from-memory.job.case.json`: declares a memory dependency and recalls the seeded user before listing posts.
Cookie-backed auth flows are handled by `ctx.http`, not by module-specific storage.

- use the top-level job `http` block for shared base URLs and default headers
- a login action can establish a session with `Set-Cookie`
- later actions in the same run automatically reuse the session
- cookies are run-scoped only and are not persisted in memory
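The run-scoped session behavior can be sketched with a toy cookie jar. This is illustrative only: `ctx.http` handles sessions internally, and the sketch ignores cookie attributes such as `Path`, `Domain`, and expiry:

```js
// Toy run-scoped cookie jar sketch; not Dispatch's ctx.http implementation.
class CookieJar {
  constructor() {
    this.cookies = new Map();
  }
  // Record the name=value pair from a Set-Cookie header, dropping attributes.
  storeFrom(setCookieHeader) {
    const [pair] = setCookieHeader.split(';');
    const [name, value] = pair.split('=');
    this.cookies.set(name.trim(), value);
  }
  // Build the Cookie header to send on subsequent requests in the same run.
  header() {
    return [...this.cookies].map(([k, v]) => `${k}=${v}`).join('; ');
  }
}

const jar = new CookieJar();
jar.storeFrom('session=abc123; Path=/; HttpOnly');
jar.header(); // 'session=abc123'
```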
See `docs/modules/http-auth.md` for the module-author contract.
```sh
dispatch module pack --path ./modules/payments --out payments.dpmod.zip
dispatch module install --bundle payments.dpmod.zip
```

Packed bundles are runtime-focused by default:

- `module.json`
- the runtime entry subtree (for example `dist/`)
- `jobs/`
- `README.md` when present

Authoring files such as `src/`, `tsconfig.json`, and bundler configs are not bundled unless the module manifest explicitly adds extra runtime assets under `pack.include`.
```
~/.dispatch/
  runtime-overrides.json — global runtime overrides
  action-defaults.json   — per-action default payloads
  modules/               — user-installed module bundles
  memory/                — namespaced memory files
    <namespace>.json
```
Drop SKILL.md into your agent's context before any session. The agent reads it once, knows the preflight sequence, the happy path, and the troubleshooting ladder. No exploration required.
```sh
dispatch skill-version  # verify skill is current
```

- No external telemetry
- Secrets redacted in run artifacts and command output
- Use job-level credential profiles backed by environment variables for sensitive values