MCP Agent Mail

Added May 4, 2026
Agent & Tooling
Open Source
Python · Workflow Automation · Multi-Agent System · FastAPI · Model Context Protocol · AI Agents · Agent & Tooling · Developer Tools & Coding · Protocol, API & Integration

An asynchronous, email-like coordination layer for AI coding agents, providing identity registration, threaded messaging, file reservations, and Git-based auditing.

MCP Agent Mail is communication and coordination middleware designed for parallel development by multiple AI coding agents. It abstracts agent interactions into a familiar email-like paradigm, exposing services via an HTTP-only FastMCP server.

Core Capabilities#

Identity & Security

  • Agents register persistent, human-readable identities (e.g., GreenCastle), authenticated via a registration_token
  • PyNaCl-based signing key generation, bearer token authentication, TOCTOU vulnerability fixes, and gitignore protection against private key commits
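To make the token flow concrete, here is a minimal stdlib sketch of bearer-token issuance and constant-time verification. It is illustrative only: the real server uses PyNaCl signing keys and its own token scheme, and the function names below are hypothetical.

```python
import hmac
import secrets

# Hypothetical in-memory token store; the real server persists identities.
_TOKENS: dict[str, str] = {}  # agent name -> issued bearer token

def register_agent(name: str) -> str:
    """Issue a random bearer token for a newly registered agent identity."""
    token = secrets.token_urlsafe(32)
    _TOKENS[name] = token
    return token

def verify_bearer(name: str, presented: str) -> bool:
    """Check a presented token; compare_digest avoids timing side channels."""
    expected = _TOKENS.get(name)
    if expected is None:
        return False
    return hmac.compare_digest(expected, presented)
```

The constant-time comparison is the important detail: a naive `==` check can leak how many leading characters of a token match.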

Messaging & Communication

  • Agents send/receive messages via GitHub-Flavored Markdown with image attachment support
  • Messages organized by thread_id with full-text search, summarization, and threaded conversation views
  • MCP resource URI support (resource://inbox/..., resource://thread/...) for fast reads
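The thread-and-search model above can be sketched in a few lines. This is a toy in-memory version with naive substring search; the actual server stores messages in SQLite/PostgreSQL via SQLModel and offers real full-text search.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Message:
    thread_id: str
    sender: str
    body_md: str  # GitHub-Flavored Markdown body

class Mailbox:
    """Toy mailbox: messages grouped by thread_id, searchable by substring."""

    def __init__(self) -> None:
        self._threads: dict[str, list[Message]] = defaultdict(list)

    def deliver(self, msg: Message) -> None:
        self._threads[msg.thread_id].append(msg)

    def thread_view(self, thread_id: str) -> list[Message]:
        # Messages in delivery order form the threaded conversation view.
        return list(self._threads[thread_id])

    def search(self, term: str) -> list[Message]:
        term = term.lower()
        return [m for msgs in self._threads.values() for m in msgs
                if term in m.body_md.lower()]
```

Organizing by `thread_id` first makes the common read path ("show me this conversation") a single lookup rather than a filter over the whole store.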

File & Conflict Management

  • Advisory file/glob reservations with TTL, supporting exclusive and shared modes
  • A pre-commit hook (keyed on the AGENT_NAME environment variable) blocks commits that conflict with other agents' active exclusive reservations
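The reservation semantics can be sketched as follows: a claim carries a TTL and a mode, expired claims are ignored, and two claims on the same path conflict unless both are shared. This is a simplified in-memory model; the real tool (`file_reservation_paths`) also supports glob patterns and persists reservations server-side.

```python
import time
from dataclasses import dataclass

@dataclass
class Reservation:
    agent: str
    path: str
    exclusive: bool
    expires_at: float

class ReservationTable:
    """Advisory reservations with TTL and exclusive/shared modes (sketch)."""

    def __init__(self) -> None:
        self._active: list[Reservation] = []

    def _live(self, now: float) -> list[Reservation]:
        # Drop expired reservations lazily on each access.
        self._active = [r for r in self._active if r.expires_at > now]
        return self._active

    def reserve(self, agent: str, path: str,
                exclusive: bool = True, ttl: float = 900.0) -> bool:
        now = time.monotonic()
        for r in self._live(now):
            # Conflict if another agent holds the path and either side
            # wants exclusivity; shared + shared is always compatible.
            if r.path == path and r.agent != agent and (r.exclusive or exclusive):
                return False
        self._active.append(Reservation(agent, path, exclusive, now + ttl))
        return True
```

The reservations are advisory: nothing stops an agent from editing a file anyway, which is why the pre-commit hook exists as the enforcement point.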

Advanced Coordination

  • Built-in macro system encapsulates high-frequency workflows — session initialization (macro_start_session), file reservation cycles (macro_file_reservation_cycle), cross-project handshakes (macro_contact_handshake) — reducing tool-call complexity for smaller models
  • Single-project bus mode and cross-repository coordination (request_contact/respond_contact)
  • Agent directory showing currently active agents, programs/models, and activity status
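The macro idea reduces to a simple expansion table: one coarse call stands in for a fixed sequence of fine-grained tool calls, so a smaller model only has to get one call right. The expansions below are plausible illustrations, not the project's exact sequences, and `release_reservation` is a hypothetical tool name.

```python
# Hypothetical macro table; real expansions may differ in order and content.
MACROS: dict[str, list[str]] = {
    "macro_start_session": ["register_agent", "fetch_inbox"],
    "macro_file_reservation_cycle": [
        "file_reservation_paths", "send_message", "release_reservation",
    ],
    "macro_contact_handshake": ["request_contact", "respond_contact"],
}

def expand(call: str) -> list[str]:
    """Expand a macro into its underlying tool calls; pass through non-macros."""
    return MACROS.get(call, [call])
```

A pass-through for non-macro names keeps the dispatch path uniform: every incoming call, coarse or fine, yields a list of primitive tool calls.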

Architecture#

The project uses a layered architecture with core logic in src/mcp_agent_mail/ and frontend in web/.

  • Access Layer: FastMCP (>= 2.10.5) HTTP-only mode, powered by FastAPI + Uvicorn
  • Business Logic Layer: SQLModel + SQLAlchemy (asyncio) ORM for agent registration, message routing, and file reservation; macro engine parses high-level directives into fine-grained tool calls
  • Persistence Layer: Structured indexing via SQLite (default) / PostgreSQL; unstructured audit data and attachments archived via GitPython to a Git repository
  • Supporting Components: Typer + Rich CLI, structlog structured logging, markdown2 + bleach safe rendering pipeline, Alembic database migrations
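The "safe rendering pipeline" mentioned above pairs a Markdown renderer (markdown2) with an HTML sanitizer (bleach). A stdlib-only sketch of the sanitizer half shows the core allowlist idea: any tag not explicitly permitted is dropped, so agent-authored content cannot inject scripts. The allowed-tag set here is an assumption, and real bleach also filters attributes and URLs.

```python
import html
from html.parser import HTMLParser

# Illustrative allowlist; the project's actual allowed tags may differ.
ALLOWED = {"p", "em", "strong", "code", "pre", "a", "ul", "ol", "li"}

class Sanitizer(HTMLParser):
    """Rebuild HTML keeping only allowlisted tags; escape all text content."""

    def __init__(self) -> None:
        super().__init__(convert_charrefs=True)
        self.out: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag in ALLOWED:
            self.out.append(f"<{tag}>")  # attributes dropped for brevity

    def handle_endtag(self, tag):
        if tag in ALLOWED:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        self.out.append(html.escape(data))

def sanitize(markup: str) -> str:
    s = Sanitizer()
    s.feed(markup)
    return "".join(s.out)
```

Sanitizing after rendering (rather than filtering the Markdown source) is the standard ordering, since the renderer itself may emit raw HTML.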

Installation & Deployment#

One-click install (recommended):

curl -fsSL "https://raw.githubusercontent.com/Dicklesworthstone/mcp_agent_mail/main/scripts/install.sh?$(date +%s)" | bash -s -- --yes

This automatically installs uv/jq, creates a Python 3.14 venv, configures detected coding agents, starts the server on port 8765 with a masked bearer token, and installs the am alias plus Beads task tracker.

Manual install:

git clone https://github.com/Dicklesworthstone/mcp_agent_mail
cd mcp_agent_mail
uv python install 3.14
uv venv -p 3.14
source .venv/bin/activate
uv sync
scripts/automatically_detect_all_installed_coding_agents_and_install_mcp_agent_mail_in_all.sh

Containerized deployment: Dockerfile, docker-compose.yml, and compose.yaml are provided. After installation, run am in any terminal to start the server.

Configuration & Integration#

  • CLI: Entry point python -m mcp_agent_mail.cli with port configuration and other management operations
  • Client config templates: Out-of-the-box MCP config files for .mcp.json, cline.mcp.json, codex.mcp.json, cursor.mcp.json, gemini.mcp.json, windsurf.mcp.json
  • Agent prompt integration: Ready-to-paste guidance for AGENTS.md / CLAUDE.md, plus SKILL.md for Claude Code auto-discovery
  • Tool granularity: Fine-grained tools (register_agent, file_reservation_paths, send_message, fetch_inbox, acknowledge_message) and coarse-grained macros for models of different capabilities
  • Verified integrations: Claude Code, Codex, Gemini CLI, Cursor, Windsurf, Factory Droid
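For orientation, MCP client configs for an HTTP server generally share one shape: a named server entry with a URL and auth header. The snippet below builds a hypothetical entry for a local server on port 8765; the key names, URL path, and server name are assumptions, not taken from the shipped templates.

```python
import json

# Hypothetical MCP client config entry (shape and names are assumptions).
config = {
    "mcpServers": {
        "agent-mail": {
            "url": "http://127.0.0.1:8765/mcp",
            "headers": {"Authorization": "Bearer <your-token>"},
        }
    }
}

# Serialize as it would appear in a .mcp.json-style file.
rendered = json.dumps(config, indent=2)
```

The shipped templates (cursor.mcp.json, windsurf.mcp.json, etc.) exist because each client expects this information in a slightly different file location and key layout.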

Important Notes#

The project is released under an MIT license with a special rider that explicitly prohibits use by OpenAI, Anthropic, and their affiliates. The current version is v0.3.2, and the development status is Alpha. There is no standalone official website (the pyproject.toml Homepage field is a placeholder). The README mentions a commercial companion iOS app but provides no link, and its exact feature boundaries are unconfirmed.