A Docker Compose-based CLI orchestrator for local LLM stacks: spin up pre-wired inference backends, frontend UIs, RAG, voice, image generation, and more with a single command
Harbor is a full-stack orchestrator for local LLM developers. With a single harbor up command, it spins up a pre-configured, pre-wired LLM stack via Docker Compose.
Core capabilities span 14+ inference backends (Ollama, vLLM, llama.cpp, TGI, SGLang, LMDeploy, KTransformers, TabbyAPI, Aphrodite Engine, mistral.rs, LiteLLM, AirLLM, KoboldCpp, Modular MAX), 7+ frontend UIs (Open WebUI, LibreChat, HuggingFace ChatUI, Lobe Chat, Hollama, BionicGPT, AnythingLLM), web RAG (SearXNG plus Perplexica, Morphic, or Local Deep Research), voice chat (Speaches STT/TTS), image generation (ComfyUI + FLUX), MCP ecosystem management (MetaMCP + mcpo), and low-code workflows (Dify, n8n, LangFlow, Flowise, Open WebUI Pipelines, LitLytics). A built-in Traefik reverse proxy handles routing automatically, built-in cloudflared enables public tunneling, and QR codes provide quick mobile access.
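Traefik's automatic routing in a Compose stack is driven by container labels that Traefik reads from the Docker socket. A minimal sketch of the pattern (hostnames, service names, and images here are illustrative assumptions, not Harbor's actual configuration):

```yaml
services:
  traefik:
    image: traefik:v3.0
    command:
      # discover routes from Docker container labels
      - --providers.docker=true
      # only route containers that opt in via traefik.enable=true
      - --providers.docker.exposedbydefault=false
    ports:
      - "80:80"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro

  webui:
    image: ghcr.io/open-webui/open-webui:main
    labels:
      - traefik.enable=true
      # route requests for this hostname to the webui container
      - traefik.http.routers.webui.rule=Host(`webui.localhost`)
      - traefik.http.services.webui.loadbalancer.server.port=8080
```

With label-based discovery, adding a new service to the stack automatically adds its route; no central proxy config file needs editing.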
The CLI prioritizes usability: arguments are order-agnostic (harbor model vllm is equivalent to harbor vllm model), harbor how offers self-service Q&A, profiles can be saved, switched, and imported from URLs, and command history is kept locally. Harbor Bench provides built-in benchmarking, and Harbor Boost offers scriptable optimization agents. harbor eject exports the current orchestration as a standalone docker-compose file that can run independently of Harbor. A companion desktop app, Harbor App, provides visual management.
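A typical session with the commands named above might look like this (the question passed to harbor how is a made-up example; the commands themselves are the ones documented here):

```sh
# bring up the pre-wired default stack
harbor up

# arguments are order-agnostic: these two invocations are equivalent
harbor model vllm
harbor vllm model

# self-service Q&A about Harbor itself (question is illustrative)
harbor how "switch the default inference backend to vllm"

# export the current orchestration as a standalone docker-compose file
harbor eject
```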
Architecturally, Harbor implements no AI inference itself; it acts purely as an orchestration glue layer, automatically handling Docker networking, environment variable injection, and service dependencies. It spans multiple runtimes (Shell, Deno, Node.js, Python), with a primary language breakdown of TypeScript 37.3%, Python 20.8%, HTML 19.8%, and Shell 15.6%. Services are organized as independent directories under services/, with profiles/ for profile storage, skills/ for skill modules, and boost/ for the optimization agent module. Version 0.4.12 introduced a workspace-init sidecar pattern to solve bind-mount permission issues across 17+ services.
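The workspace-init sidecar pattern can be sketched in Compose terms: a short-lived init container fixes ownership of the bind mount, and the main service waits for it to finish. Service names, paths, and the UID here are hypothetical, not Harbor's actual definitions:

```yaml
services:
  webui-init:
    image: busybox
    # run once: give the app user ownership of the bind-mounted workspace
    command: chown -R 1000:1000 /workspace
    volumes:
      - ./webui/data:/workspace

  webui:
    image: ghcr.io/open-webui/open-webui:main
    volumes:
      - ./webui/data:/app/backend/data
    depends_on:
      webui-init:
        # start only after the init sidecar has exited successfully
        condition: service_completed_successfully
```

The service_completed_successfully condition (supported by modern Docker Compose) is what turns a plain container into a one-shot init step, avoiding root-owned files leaking into the host directory.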
Requires Docker Engine and Docker Compose; supports Linux and macOS. Installation is a one-line curl script that places Harbor under ~/.harbor and links it onto PATH. The latest version at the time of writing is v0.4.14, with 136 releases total and a rapid iteration cadence. Licensed under Apache-2.0.