agents.txt vs AGENTS.md vs agent.json vs robots.txt
Eight competing approaches to agent discovery, and no interoperability between them. Here is what each one does, who backs it, and where it stands as of April 2026.
Side-by-Side Comparison
How the 8 agent discovery protocols compare across key dimensions.
| Protocol | Format | Location | Status | Backer | Adoption |
|---|---|---|---|---|---|
| robots.txt | Plain text, line-based directives | /robots.txt | Standard | Google / W3C / Industry standard since 1994 | Universal |
| agents.txt | Plain text with structured fields; JSON at /.well-known/agents | /agents.txt and /.well-known/agents | IETF Draft | IETF (draft-srijal-agents-policy-00) | Early |
| AGENTS.md | Markdown (.md) | /AGENTS.md in repository root | Community Draft | Pydantic AI community / Open-source convention | Early |
| A2A Agent Cards (agent.json) | JSON (Agent Card schema) | /.well-known/agent.json | Specification | Google + Linux Foundation (150+ organizations) | Growing |
| MCP Server Cards | JSON (SEP-1649 specification) | Registry-hosted or self-declared | Proposed | MCP community (Anthropic ecosystem) | Pre-adoption |
| Agent Web Protocol (AWP) | JSON at .well-known path | /.well-known/agent.json | Community Draft | Community / Independent developers | Early |
| AID (DNS TXT Records) | DNS TXT records | _agent.example.com TXT record | Proposed | Command Zero / ACDP working group | Pre-adoption |
| DNS-AID (SVCB Records) | DNS SVCB/HTTPS records | _agent.example.com SVCB record | Proposed | Community / DNS standards ecosystem | Pre-adoption |
Which Protocol Should You Use?
You want to control crawler access
Use robots.txt. It is the universal standard for crawl permissions. Every search engine and AI crawler respects it. Start here.
You want agents to interact with your site
Use agents.txt. It declares endpoints, authentication, and capabilities. Simplest path for web-facing agent policies.
You are building agent-to-agent communication
Use A2A Agent Cards. Google + 150 organizations back this. Richest schema for agent identity and skills.
You are building tools for AI models
Use MCP and watch MCP Server Cards. 97M monthly SDK downloads make MCP the dominant tool integration layer.
You maintain an open-source AI project
Add an AGENTS.md to your repo root. It helps coding assistants and developers understand your agent's behavior.
You want maximum discoverability
Use all of them. A cross-protocol approach with robots.txt + agents.txt + A2A Agent Card covers the broadest surface. Global Chat indexes all formats.
Protocol Deep Dives
Detailed breakdown of each protocol: purpose, format, strengths, weaknesses, and code examples.
robots.txt
Status: Standard · Adoption: Universal
Controls which crawlers can access which parts of a website. A passive permission layer: it tells bots what NOT to do.
Format
Plain text, line-based directives
Location
/robots.txt
Backed by
Google / W3C / Industry standard since 1994
Adoption
Universal — every major crawler respects it
Strengths
- Universal adoption across all search engines and crawlers
- Simple format that anyone can write
- 30 years of ecosystem support and tooling
Weaknesses
- Only handles crawl permissions, not agent capabilities
- No structured data about what agents CAN do
- Cannot express authentication, payment, or API endpoints
- Advisory only — no enforcement mechanism
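Because robots.txt is advisory, honoring it is the agent's job, and Python's standard library already does the matching. A minimal sketch using urllib.robotparser (the agent name and URLs are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Rules mirroring the example file below.
RULES = """\
User-agent: *
Disallow: /private/
Allow: /public/
"""

rp = RobotFileParser()
rp.parse(RULES.splitlines())

# Check a URL before fetching it.
print(rp.can_fetch("MyAgent", "https://example.com/private/data"))  # False
print(rp.can_fetch("MyAgent", "https://example.com/public/page"))   # True
```

In production you would call `rp.set_url("https://example.com/robots.txt")` and `rp.read()` instead of parsing an inline string.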
Global Chat Support
Global Chat serves its own robots.txt and reads robots.txt from discovered sites to stay crawl-compliant.
Example
User-agent: *
Disallow: /private/
Allow: /public/
User-agent: GPTBot
Disallow: /
agents.txt
Status: IETF Draft · Adoption: Early
Declares what AI agents CAN do on a site: endpoints, authentication, payment methods, and capabilities. The affirmative counterpart to robots.txt.
Format
Plain text with structured fields; JSON at /.well-known/agents
Location
/agents.txt and /.well-known/agents
Backed by
IETF (draft-srijal-agents-policy-00)
Adoption
Early — growing adoption among agent-first platforms
Strengths
- Natural extension of the robots.txt mental model
- IETF standardization path gives it institutional weight
- Dual format: human-readable text + machine-readable JSON
- Covers authentication, payment, and capability declaration
Weaknesses
- IETF draft expires April 10, 2026 — renewal uncertain
- No built-in agent-to-agent communication protocol
- Adoption still limited to early movers
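The text form is simple `Field: value` lines, so a client-side reader fits in a few lines of Python. This is a sketch that follows the example in this article, not the draft's normative grammar:

```python
def parse_agents_txt(text: str) -> dict:
    """Collect 'Field: value' pairs; lines starting with '#' are comments.
    Field names here are illustrative, taken from this article's example."""
    fields = {}
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#") or ":" not in line:
            continue
        key, value = line.split(":", 1)
        fields[key.strip()] = value.strip()
    return fields

SAMPLE = """\
# agents.txt
User-agent: *
Agent-name: ExampleBot
Auth-endpoint: /api/auth
Payment-methods: crypto, stripe
Capabilities: search, purchase, bid
"""

print(parse_agents_txt(SAMPLE)["Capabilities"])  # search, purchase, bid
```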
Global Chat Support
Full support. Global Chat validates agents.txt files, indexes them in the directory, and serves its own agents.txt.
Example
# agents.txt
User-agent: *
Agent-name: ExampleBot
Agent-description: Commerce agent
Auth-endpoint: /api/auth
Payment-methods: crypto, stripe
Capabilities: search, purchase, bid
AGENTS.md
Status: Community Draft · Adoption: Early
A human-readable Markdown file describing an agent or repository to other AI coding agents. Primarily aimed at developer tooling and LLM-based code assistants.
Format
Markdown (.md)
Location
/AGENTS.md in repository root
Backed by
Pydantic AI community / Open-source convention
Adoption
Early — adopted in AI/ML open-source projects
Strengths
- Human-readable and easy to write
- No tooling required — just a markdown file
- Fits naturally into repository-based workflows
- Good for documenting agent behavior for developers
Weaknesses
- No formal specification or schema
- Not machine-parseable in a standardized way
- Repository-scoped, not web-scoped — no URL-based discovery
- No support for authentication, payment, or API endpoints
Global Chat Support
Global Chat can index AGENTS.md files from GitHub repositories and display them in the directory.
Example
# AGENTS.md
## Overview
This repository contains an AI agent for code review automation.
## Capabilities
- Pull request analysis
- Code style enforcement
- Security vulnerability detection
## Integration
Supports GitHub Actions and GitLab CI.
A2A Agent Cards (agent.json)
Status: Specification · Adoption: Growing
A structured JSON card describing an agent's identity, capabilities, skills, and communication endpoints for Google's Agent-to-Agent (A2A) protocol.
Format
JSON (Agent Card schema)
Location
/.well-known/agent.json
Backed by
Google + Linux Foundation (150+ organizations)
Adoption
150+ organizations — second-largest ecosystem after MCP
Strengths
- Backed by Google and 150+ organizations under Linux Foundation
- Rich schema covering identity, skills, and endpoints
- Built-in support for agent-to-agent communication
- Well-known URL convention enables web-scale discovery
Weaknesses
- Complex schema — higher barrier to entry than text files
- Tightly coupled to A2A protocol for full value
- Not interoperable with MCP or agents.txt ecosystems
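Before trusting a fetched card, a client can sanity-check the fields it plans to use. A sketch against the fields shown in the example card (this checks only those fields, not the full Agent Card schema):

```python
def check_agent_card(card: dict) -> list[str]:
    """Return a list of problems; an empty list means the card passes
    these basic checks. Field names follow this article's example card."""
    problems = []
    for field in ("name", "description", "url"):
        if not card.get(field):
            problems.append(f"missing {field}")
    for i, skill in enumerate(card.get("skills", [])):
        if "id" not in skill:
            problems.append(f"skill {i} missing id")
    return problems

# A card like the example, but with "description" omitted.
card = {
    "name": "RecommendationAgent",
    "url": "https://example.com/agent",
    "skills": [{"id": "product-search", "name": "Product Search"}],
}
print(check_agent_card(card))  # ['missing description']
```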
Global Chat Support
Global Chat indexes A2A Agent Cards and displays agent capabilities in the cross-protocol directory.
Example
{
  "name": "RecommendationAgent",
  "description": "Product recommendation agent",
  "url": "https://example.com/agent",
  "capabilities": {
    "streaming": true,
    "pushNotifications": false
  },
  "skills": [
    {
      "id": "product-search",
      "name": "Product Search"
    }
  ]
}
MCP Server Cards
Status: Proposed · Adoption: Pre-adoption
Capability metadata cards for MCP servers, enabling clients to discover which tools, resources, and prompts a server provides before connecting.
Format
JSON (SEP-1649 specification)
Location
Registry-hosted or self-declared
Backed by
MCP community (Anthropic ecosystem)
Adoption
Pre-adoption — spec work active, 97M monthly MCP SDK downloads provide potential reach
Strengths
- Builds on MCP, the dominant tool integration protocol (97M downloads/month)
- Would enable pre-connection capability discovery
- Active spec work in SEP-1649 with community engagement
Weaknesses
- Still in proposal stage — not yet part of MCP spec
- MCP is tool-integration focused, not agent-discovery focused
- No standardized hosting location yet
- Does not cover agent-to-agent scenarios
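The intended payoff is pre-connection filtering: a client skips servers whose card does not cover the tools it needs. A sketch using the card shape from this article's example (SEP-1649 is not finalized, so treat the field names as assumptions):

```python
def server_covers(card: dict, needed_tools: set[str]) -> bool:
    """True if the card declares every tool in needed_tools.
    Card shape follows this article's example; the spec may differ."""
    return needed_tools <= set(card.get("tools", []))

card = {
    "name": "github-mcp-server",
    "tools": ["create_issue", "search_repos"],
}
print(server_covers(card, {"create_issue"}))         # True
print(server_covers(card, {"create_pull_request"}))  # False
```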
Global Chat Support
Global Chat tracks MCP Server Cards development and will index them once the specification is finalized.
Example
{
  "name": "github-mcp-server",
  "version": "1.2.0",
  "description": "GitHub integration",
  "tools": [
    "create_issue",
    "search_repos",
    "create_pull_request"
  ],
  "resources": ["repo://owner/name"],
  "auth": "oauth2"
}
Agent Web Protocol (AWP)
Status: Community Draft · Adoption: Early
Web-native agent discovery using standard .well-known URLs. Defines how agents advertise capabilities through HTTP endpoints.
Format
JSON at .well-known path
Location
/.well-known/agent.json
Backed by
Community / Independent developers
Adoption
Early — limited implementations
Strengths
- Uses existing web infrastructure (.well-known)
- HTTP-native — no new protocols needed
- Compatible with existing web security models
Weaknesses
- Overlaps with A2A Agent Cards on .well-known/agent.json
- No major corporate backer
- Limited tooling and validator support
Global Chat Support
Global Chat can discover AWP endpoints and index them alongside other protocols.
Example
// GET /.well-known/agent.json
{
  "agent": "TaskRunner",
  "version": "1.0",
  "endpoints": [
    "/api/tasks",
    "/api/status"
  ],
  "auth": "bearer"
}
AID (DNS TXT Records)
Status: Proposed · Adoption: Pre-adoption
DNS-based agent identity and discovery using TXT records. Lets domain owners declare agent capabilities at the DNS layer.
Format
DNS TXT records
Location
_agent.example.com TXT record
Backed by
Command Zero / ACDP working group
Adoption
Pre-adoption — concept stage
Strengths
- Leverages existing DNS infrastructure — no new servers needed
- Domain-level authority — only domain owners can set records
- Familiar pattern (like SPF/DKIM for email)
Weaknesses
- DNS TXT records have size limits (255 bytes per string)
- Cannot express rich capability schemas
- DNS propagation delays make updates slow
- Requires DNS access, which many developers lack
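Once resolved, the TXT value is a semicolon-separated list of `key=value` pairs, SPF-style. A parsing sketch (the field names are illustrative; the AID proposal has not standardized them):

```python
def parse_aid_record(txt: str) -> dict:
    """Split a TXT value into key=value pairs on ';'.
    Keys like 'v', 'name', 'cap' follow this article's example only."""
    fields = {}
    for part in txt.split(";"):
        part = part.strip()
        if "=" in part:
            key, value = part.split("=", 1)
            fields[key] = value
    return fields

RECORD = "v=agent1; name=SupportBot; cap=search,chat; url=https://example.com/agent"
print(parse_aid_record(RECORD)["cap"])  # search,chat
```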
Global Chat Support
Global Chat monitors AID proposals and will support DNS-based discovery when standards mature.
Example
; DNS TXT record
_agent.example.com. IN TXT "v=agent1; name=SupportBot; cap=search,chat; url=https://example.com/agent"
DNS-AID (SVCB Records)
Status: Proposed · Adoption: Pre-adoption
Advanced DNS-based agent discovery using SVCB/HTTPS resource records. Richer than TXT records: it supports structured key-value parameters.
Format
DNS SVCB/HTTPS records
Location
_agent.example.com SVCB record
Backed by
Community / DNS standards ecosystem
Adoption
Pre-adoption — requires SVCB support in resolvers
Strengths
- SVCB records support structured key-value parameters
- Built on IETF-standard record type (RFC 9460)
- Can encode priority, transport, and endpoints natively
Weaknesses
- SVCB support still rolling out across DNS resolvers
- Complex to configure compared to file-based approaches
- No existing tooling for agent-specific SVCB records
- Requires DNS expertise that most developers lack
Global Chat Support
Global Chat tracks DNS-AID development as part of the discovery landscape monitoring.
Example
; DNS SVCB record
_agent.example.com. IN SVCB 1 agent.example.com. alpn=h2 port=443 key65400=search,chat
The Fragmentation Reality
In April 2026, 8+ protocols compete to define how AI agents discover each other. None of them are interoperable. An agent that reads agents.txt cannot parse A2A Agent Cards. An MCP client has no way to discover agents using DNS TXT records.
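A discovery layer that spans the file-based formats starts by probing each protocol's published location for a given origin. A sketch of the candidate URLs, with paths taken from the comparison table above (the DNS-based protocols need a resolver query instead):

```python
# Well-known locations, one per file-based protocol in the comparison table.
DISCOVERY_PATHS = [
    "/robots.txt",              # crawl permissions
    "/agents.txt",              # agents.txt draft (text form)
    "/.well-known/agents",      # agents.txt draft (JSON form)
    "/.well-known/agent.json",  # A2A Agent Card / AWP
]

def discovery_urls(origin: str) -> list[str]:
    """Candidate discovery URLs to probe for one origin."""
    return [origin.rstrip("/") + path for path in DISCOVERY_PATHS]

print(discovery_urls("https://example.com")[0])  # https://example.com/robots.txt
```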
The IETF agents.txt draft expires on April 10, 2026. Whether it gets renewed will signal the standards body's commitment to file-based agent policy. Meanwhile, Google's A2A with 150+ organizations has the largest coalition, and MCP with 97M monthly downloads has the largest developer base.
The path forward is not picking one winner. It is building infrastructure that spans all of them — a discovery layer that reads every format and translates between protocols. That is what Global Chat is building.
Make your agent discoverable across all protocols
Global Chat indexes agents.txt, A2A Agent Cards, MCP servers, and more. Validate your discovery files or register your agent in the cross-protocol directory.