Open-source MCP Host framework that wraps MCP Servers into full-featured AI Agents with reasoning, tool orchestration, and rich interactive UI.
Nanobot is an open-source MCP Host framework developed by Obot AI (formerly Acorn Labs, $35M Seed funding). It upgrades MCP Servers that expose only tool functions into complete AI Agents with personality, reasoning capabilities, and user interfaces. The backend MCP Host engine is written in Go (84.4% of the codebase) and the embedded chat UI in Svelte (10.1%), and Agents can be defined either in a single nanobot.yaml file or through a directory-based agents/*.md configuration.
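The summary above mentions the directory-based configuration but does not show its layout. A minimal sketch, assuming one markdown file per Agent as the README's agents/*.md pattern implies (the second file name is illustrative):

```
agents/
  main.md      # default entry Agent
  dealer.md    # additional Agent, one file per Agent
```

Only the agents/main.md default-entry convention is stated by the project; the exact contents of each file are not documented in this summary.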
Core Capabilities
- MCP Host Engine: Runs as an independent MCP Host, connecting MCP Servers with LLMs, targeting full MCP + MCP-UI compliance
- Agent Definition & Orchestration: Single-file config for Agent name, model, and bound MCP Servers; directory config supports multiple Agents, with `agents/main.md` as the default entry
- Multi-LLM Providers: Built-in OpenAI (e.g., gpt-4) and Anthropic (e.g., claude-3) support, extensible to Azure OpenAI, AWS Bedrock, Ollama, etc. via `llmProviders`, with flexible API protocol dialects (OpenAIResponses, OpenAIChatCompletions, AnthropicMessages, OpenResponses) and custom endpoints
- MCP-UI Rich Interaction: Supports the MCP-UI protocol, enabling Agents to render rich interactive elements such as buttons, graphics, forms, product cards, and shopping carts within the chat interface
- Embedded Web UI: Svelte-based chat interface, served at `http://localhost:8080` by default
- Multi-Channel Deployment: Agents can interact with users via chat, voice, SMS, email, Slack, and more
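To make the `llmProviders` extension point concrete, here is a hedged sketch of adding an Ollama provider. Only the `llmProviders` key and the dialect names come from the capabilities listed above; the provider key, `baseURL`, and `apiKey` field names are illustrative assumptions, not confirmed nanobot syntax:

```yaml
llmProviders:
  ollama:
    # Dialect name is one of those the project lists;
    # the other field names here are guesses for illustration.
    dialect: OpenAIChatCompletions
    baseURL: http://localhost:11434/v1
    apiKey: ${OLLAMA_API_KEY}
```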
Typical Scenarios
- Quickly wrapping internal APIs/tools as conversational frontends
- Building game Agents (e.g., Blackjack Dealer) with full game UI rendered in chat
- E-commerce shopping Agents integrating with Shopify, displaying product cards, shopping carts, and order history
- Deploying as standalone services or embedding in third-party applications
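For the first scenario, a hedged config sketch of binding an internal tool process as an Agent's backend. The overall shape mirrors the minimal nanobot.yaml shown later in this page; the `command` field for launching a local stdio MCP server is an assumption modeled on common MCP host configs, not confirmed nanobot syntax:

```yaml
agents:
  support:
    name: Internal API Assistant
    model: gpt-4.1
    mcpServers: internalapi

mcpServers:
  internalapi:
    # Assumed stdio-style launch; nanobot's actual field may differ.
    command: ./bin/internal-api-mcp
```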
Installation & Quick Start
```shell
# Homebrew
brew install obot-platform/tap/nanobot

# From source
make
```
Minimal config example (nanobot.yaml):
```yaml
agents:
  dealer:
    name: Blackjack Dealer
    model: gpt-4.1
    mcpServers: blackjackmcp

mcpServers:
  blackjackmcp:
    url: https://blackjack.nanobot.ai/mcp
```

Set your API key and run:

```shell
export OPENAI_API_KEY=sk-...
nanobot run ./nanobot.yaml
```
Open http://localhost:8080 in your browser after launch. The project also provides `Dockerfile` and `Dockerfile.agent` for containerized deployment.
Architecture Highlights
- Backend: Go implementation; entry point `main.go`, core logic in the `pkg/` directory
- Frontend: Svelte implementation in `packages/ui/`, managed as a pnpm workspace; dev server on port 5173 is proxied by the backend on port 8080
- Build & Release: Managed via `Makefile` + `go.mod`, with automated releases via `.goreleaser.yml`
- Testing & CI: `integration_test/` for integration tests, `.github/workflows/` for CI/CD
- Organization: Complementary product to the Obot MCP Gateway under the same Obot AI organization
Important Notes
The project is currently in Alpha stage (v0.0.80, 81 releases); the README explicitly states it is "under heavy development" and "moving away from its original design and intent", so significant breaking changes and architectural shifts should be expected. The nanobot.ai domain now redirects to obot.ai, and it remains unclear whether Nanobot will be fully merged into the Obot platform. Licensed under Apache-2.0.