Make any website agent-ready with a single JSON file.
AWP defines agent.json — a machine-readable file at /.well-known/agent.json that tells AI agents what your site can do. Like robots.txt for crawlers, but for AI agents.
The web was built for human eyes and hands. AI agents fail on it constantly — not because the AI isn't capable, but because:
- Websites require JavaScript rendering before content appears
- Auth flows expire with no agent recovery path
- CAPTCHAs and MFA are designed to block non-humans
- Error messages are written for humans, not machines
- No standard exists to declare what actions are available on any surface
There is no equivalent of robots.txt for agent capabilities. Agents are left to infer, guess, and fail. Agent Web Protocol fixes this.
robots.txt → what crawlers cannot access
sitemap.xml → what pages exist
llms.txt → what content means
agent.json → what agents can DO
Any domain publishes a file at /.well-known/agent.json — a fixed, well-known path, following the same discovery convention as robots.txt. Agents find it automatically. No intermediary required.
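Discovery can be as simple as fetching the well-known path. A minimal sketch (function names are illustrative, not part of the spec):

```python
import json
import urllib.error
import urllib.request

WELL_KNOWN_PATH = "/.well-known/agent.json"

def manifest_url(domain: str) -> str:
    """Build the well-known agent.json URL for a domain."""
    return f"https://{domain}{WELL_KNOWN_PATH}"

def fetch_manifest(domain: str, timeout: float = 5.0):
    """Fetch and parse agent.json; return None if the site doesn't publish one."""
    try:
        with urllib.request.urlopen(manifest_url(domain), timeout=timeout) as resp:
            return json.load(resp)
    except (urllib.error.URLError, json.JSONDecodeError):
        return None
```

An agent that gets `None` back simply falls back to whatever heuristics it used before — publishing the file is strictly additive.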
The file declares:
- Actions — what the agent can do, with typed inputs and outputs
- Auth — what requires authentication and how to refresh tokens
- Errors — every failure state and its recovery instruction
- Dependencies — which actions must precede others
- Hints — semantic guidance for agent planning
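A sketch of what such a file might look like. All field names below are illustrative, not normative — consult the full specification for the actual schema:

```json
{
  "awp_version": "0.1",
  "name": "Example Store",
  "actions": [
    {
      "id": "search_products",
      "description": "Search the product catalog",
      "method": "GET",
      "endpoint": "/api/search",
      "inputs": { "query": { "type": "string", "required": true } },
      "outputs": { "results": { "type": "array" } }
    },
    {
      "id": "create_order",
      "description": "Place an order for a product",
      "method": "POST",
      "endpoint": "/api/orders",
      "auth": "oauth2",
      "depends_on": ["search_products"],
      "errors": {
        "out_of_stock": { "recovery": "Re-run search_products and choose an in-stock item" }
      }
    }
  ],
  "auth": {
    "oauth2": { "token_url": "/oauth/token", "refresh": true }
  },
  "hints": ["Orders require an authenticated session"]
}
```

Note how the example ties the five sections together: `create_order` declares its auth scheme, its dependency on `search_products`, and a machine-readable recovery path for its failure state.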
→ Full specification
→ Schema reference and examples
→ Validator
Current version: v0.1 (draft)
Status: Open RFC — feedback welcome
This spec is community-governed: open a GitHub issue to propose a change, or submit a pull request. Contact: spec@agentwebprotocol.org