A secure persistent personal AI agent server delivered as a single Rust binary, supporting 20+ LLM providers, 9 communication channels, sandboxed execution, and hybrid memory search.
Moltis is a local-first persistent personal AI agent server delivered as a single Rust binary with no Node.js or npm runtime dependency. It acts as a secure gateway between users and multiple LLM providers, currently supporting 20+ cloud providers including OpenAI, Anthropic, Google Gemini, Mistral, and DeepSeek, along with local model backends such as Ollama, LM Studio, GGUF, and MLX. Zero-config OAuth is available for GitHub Copilot, OpenAI Codex, and Kimi Code.
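A multi-provider gateway like this typically routes each request through a common provider abstraction. The sketch below is purely illustrative (the trait, type, and function names are hypothetical, not Moltis's actual API) and shows how heterogeneous backends can sit behind one interface:

```rust
// Hypothetical provider-agnostic gateway trait; names are illustrative
// and do not reflect Moltis's real internals.
trait ChatProvider {
    fn name(&self) -> &str;
    fn complete(&self, prompt: &str) -> Result<String, String>;
}

// A trivial stand-in backend used for demonstration.
struct EchoProvider;

impl ChatProvider for EchoProvider {
    fn name(&self) -> &str {
        "echo"
    }
    fn complete(&self, prompt: &str) -> Result<String, String> {
        Ok(format!("echo: {prompt}"))
    }
}

// Select a registered provider by name; real routing would also consider
// model capabilities, cost, or user configuration.
fn route<'a>(providers: &'a [Box<dyn ChatProvider>], wanted: &str) -> Option<&'a dyn ChatProvider> {
    providers.iter().map(|p| p.as_ref()).find(|p| p.name() == wanted)
}

fn main() {
    let providers: Vec<Box<dyn ChatProvider>> = vec![Box::new(EchoProvider)];
    let p = route(&providers, "echo").expect("provider registered");
    println!("{}", p.complete("hi").unwrap()); // prints "echo: hi"
}
```

The trait-object approach keeps the gateway core decoupled from any single vendor SDK, which is one plausible way to support 20+ providers from one binary.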
On the interaction layer, Moltis provides a unified entry point across 9 channels: Web UI, Telegram, WhatsApp, Discord, Slack, Matrix, Nostr, Microsoft Teams, and iOS PWA. Voice capabilities integrate 8 TTS and 7 STT providers, with support for voice personas and a wake-up mode. The memory system uses a hybrid search architecture that combines SQLite full-text search (FTS) with vector embeddings; sessions are persisted as JSONL with automatic compression, cross-session recall, and Cursor-compatible project context.
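Hybrid search needs some way to merge a keyword (FTS) ranking with a vector-similarity ranking. Moltis's actual fusion strategy is not documented here; the sketch below shows one common, generic technique, reciprocal rank fusion (RRF), using only the standard library:

```rust
use std::collections::HashMap;

// Illustrative reciprocal rank fusion: each document earns 1/(k + rank + 1)
// from every ranked list it appears in, and the sums decide the final order.
// This is a generic technique, not necessarily what Moltis implements.
fn rrf(keyword: &[&str], vector: &[&str], k: f64) -> Vec<(String, f64)> {
    let mut scores: HashMap<String, f64> = HashMap::new();
    for list in [keyword, vector] {
        for (rank, id) in list.iter().enumerate() {
            // Documents near the top of either list score higher.
            *scores.entry((*id).to_string()).or_insert(0.0) += 1.0 / (k + rank as f64 + 1.0);
        }
    }
    let mut out: Vec<_> = scores.into_iter().collect();
    out.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    out
}

fn main() {
    // "note-b" appears in both rankings, so it wins the fused list.
    let fused = rrf(&["note-a", "note-b"], &["note-b", "note-c"], 60.0);
    println!("top hit: {}", fused[0].0); // prints "top hit: note-b"
}
```

RRF is attractive for SQLite-backed hybrid search because it needs only ranks, not comparable raw scores, so FTS match scores and cosine similarities never have to be normalized against each other.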
Security is a core design dimension of Moltis: tool execution is isolated through three sandbox backends (Docker/Podman, Apple Container, or WASM); secrets are managed via secrecy::Secret with automatic redaction in tool outputs; data at rest is encrypted with XChaCha20-Poly1305, with keys derived via Argon2id; built-in SSRF protection blocks access to loopback and private addresses; BeforeToolCall hooks can audit or block any tool invocation; and supply-chain integrity is supported through Sigstore keyless signing, GPG signing, and GitHub artifact attestation.
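The SSRF guard described above amounts to vetting every resolved address before an outbound request is allowed. A minimal sketch of that idea, using only `std::net` (the function name and exact policy are illustrative, not Moltis's code):

```rust
use std::net::IpAddr;

// Reject addresses a tool should never be allowed to reach: loopback,
// RFC 1918 private ranges, link-local, and the unspecified address.
// Real deployments would also pin DNS results to avoid rebinding attacks.
fn is_blocked(addr: IpAddr) -> bool {
    match addr {
        IpAddr::V4(v4) => {
            v4.is_loopback() || v4.is_private() || v4.is_link_local() || v4.is_unspecified()
        }
        IpAddr::V6(v6) => v6.is_loopback() || v6.is_unspecified(),
    }
}

fn main() {
    for ip in ["127.0.0.1", "10.0.0.5", "93.184.216.34"] {
        let parsed: IpAddr = ip.parse().unwrap();
        println!("{ip} blocked: {}", is_blocked(parsed));
        // 127.0.0.1 and 10.0.0.5 are blocked; 93.184.216.34 is public.
    }
}
```

The key point is that the check runs on the resolved IP, not the hostname, so `http://localhost/` and a DNS name pointing at 10.0.0.5 are caught by the same rule.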
Extensibility includes MCP server support (stdio and HTTP/SSE), a built-in Skill system with OpenClaw import, 15 lifecycle hook events (with circuit breakers and destructive-command protection), and WASM tools sandboxed via Wasmtime. Operations features cover Cron scheduling, CalDAV calendar integration, Tailscale networking, and one-click cloud deployment templates for Fly.io and DigitalOcean. The codebase spans approximately 270K LoC across 59 modular crates, with core agent/gateway code in safe Rust and unsafe code isolated to FFI and WASM boundaries. Platform support covers macOS (Apple Silicon and Intel), Linux (x86_64 and ARM64), Windows (x86_64), and resource-constrained devices such as the Raspberry Pi via a lightweight feature mode.
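A circuit breaker on lifecycle hooks keeps one repeatedly failing hook from stalling every agent turn. The sketch below shows the general pattern in its simplest form; the struct, field names, and threshold are assumptions for illustration, not Moltis's actual configuration:

```rust
// Minimal hook circuit breaker: after `threshold` consecutive failures the
// hook is skipped instead of invoked; one success closes the breaker again.
struct Breaker {
    consecutive_failures: u32,
    threshold: u32,
}

impl Breaker {
    fn new(threshold: u32) -> Self {
        Self { consecutive_failures: 0, threshold }
    }

    // Should this hook still be invoked?
    fn allow(&self) -> bool {
        self.consecutive_failures < self.threshold
    }

    // Record the outcome of one hook run.
    fn record(&mut self, ok: bool) {
        if ok {
            self.consecutive_failures = 0; // success closes the breaker
        } else {
            self.consecutive_failures += 1;
        }
    }
}

fn main() {
    let mut b = Breaker::new(3);
    for _ in 0..3 {
        b.record(false); // three failing hook runs in a row
    }
    println!("hook still allowed: {}", b.allow()); // prints "hook still allowed: false"
}
```

Production breakers usually add a cool-down timer and a half-open probe state, but even this consecutive-failure counter is enough to stop a broken BeforeToolCall hook from blocking every subsequent tool invocation.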
Installation (any one of the following):
curl -fsSL https://www.moltis.org/install.sh | sh                  # install script
brew install moltis-org/tap/moltis                                 # Homebrew
docker pull ghcr.io/moltis-org/moltis:latest                       # Docker image
cargo install moltis --git https://github.com/moltis-org/moltis    # build from source
Unconfirmed: no Hugging Face page found; no associated academic paper; full native Windows support needs further confirmation (Docker works); the iOS native app is marked as SOON; a Signal channel crate exists but is not listed on the official site. The author describes the project as alpha, though it already has 90 releases and 3,498+ commits.