Firecrawl
✨ An API to power AI agents with clean web data — search, scrape, interact with the web, and output LLM-ready Markdown and structured data at scale.
A declarative orchestration framework for agentic AI and LLM applications, supporting multi-agent collaboration, full-pipeline RAG, and DAG workflows.
A local-first context runtime for AI coding agents that significantly reduces token consumption from file reads and shell output via multi-strategy compression and property-graph techniques; compatible with 28+ AI coding tools, including Cursor, Claude Code, and Copilot.
An official Docker AI Agent builder and runtime that enables multi-agent orchestration, model-agnostic scheduling, and OCI-standard distribution through declarative YAML/HCL configuration.
A local-first AI assistant that remembers and grows — available via Desktop, Cloud Server, and 12 IM channels.
An MCP server that optimizes context windows for AI coding agents through sandboxed execution, session persistence, and output compression — reducing context consumption by up to 98%.
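To make the "output compression" idea concrete, here is a minimal sketch (not the project's actual algorithm; function name and heuristics are illustrative assumptions): collapse runs of duplicate lines and elide the middle of long shell output before it enters an LLM context window.

```python
# Illustrative sketch only: one simple way an agent runtime can shrink
# verbose tool/shell output before adding it to a context window.
def compress_output(text: str, head: int = 5, tail: int = 5) -> str:
    # Collapse consecutive identical lines into one line plus a repeat count.
    deduped: list[list] = []
    for line in text.splitlines():
        if deduped and deduped[-1][0] == line:
            deduped[-1][1] += 1
        else:
            deduped.append([line, 1])
    lines = [l if n == 1 else f"{l}  [repeated {n}x]" for l, n in deduped]

    # If the result is still long, keep only the head and tail and note
    # how many lines were dropped, so the model sees the elision.
    if len(lines) <= head + tail:
        return "\n".join(lines)
    omitted = len(lines) - head - tail
    return "\n".join(
        lines[:head] + [f"... [{omitted} lines omitted] ..."] + lines[-tail:]
    )
```

Real runtimes layer several such strategies (summarization, structural parsing, caching); this shows only the cheapest one.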
A self-hosted AI chat platform that unifies major AI providers into a single, privacy-focused interface.
An application framework for AI engineering that provides unified abstractions over multiple AI models and vector stores, supporting RAG, function calling, and other enterprise AI application patterns.
A persistent, shared memory backend for AI Agent pipelines, providing REST API, MCP protocol, and knowledge graph with hybrid search, autonomous memory consolidation, and multi-agent collaboration — fully self-hosted with zero cloud cost.
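"Hybrid search" here means merging results from two retrievers, typically a keyword index and a vector index. A common fusion method is Reciprocal Rank Fusion (RRF); the sketch below is a generic illustration of that technique, not this project's actual retrieval code, and the document IDs are made up.

```python
# Reciprocal Rank Fusion: merge ranked lists from multiple retrievers
# without needing their raw scores to be comparable.
def rrf_fuse(rankings: list[list[str]], k: int = 60) -> list[str]:
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking):
            # 1 / (k + rank) damps the influence of low-ranked hits;
            # documents found by both retrievers accumulate score.
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

keyword_hits = ["doc3", "doc1", "doc7"]  # hypothetical keyword-index ranking
vector_hits = ["doc1", "doc9", "doc3"]   # hypothetical vector-index ranking
fused = rrf_fuse([keyword_hits, vector_hits])
```

Because `doc1` and `doc3` appear in both lists, they rise to the top of the fused ranking even though neither retriever alone agrees on the winner.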
An ultra-fast, multi-tenant graph database powered by a GraphBLAS sparse-matrix engine, running as a Redis Module, optimized for GraphRAG and AI Agent workloads.
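The sparse-matrix framing means graph traversal becomes linear algebra: one BFS level is a boolean matrix-vector product over an (OR, AND) semiring, with visited vertices masked out. The toy sketch below illustrates that idea in plain Python sets (a real GraphBLAS engine stores the adjacency as a compressed sparse matrix and vectorizes the product); the graph and function are illustrative assumptions.

```python
# Toy BFS in the GraphBLAS style: each iteration computes the next
# frontier as "adjacency * frontier", masked by unvisited vertices.
def bfs_levels(adj: dict[str, set[str]], source: str) -> dict[str, int]:
    levels = {source: 0}
    frontier = {source}
    depth = 0
    while frontier:
        depth += 1
        # "Multiply": union of out-neighbours of the frontier,
        # masked by the set of vertices not yet assigned a level.
        frontier = {v for u in frontier for v in adj.get(u, ()) if v not in levels}
        for v in frontier:
            levels[v] = depth
    return levels

adj = {"a": {"b", "c"}, "b": {"d"}, "c": {"d"}, "d": set()}
```

Expressing traversal this way is what lets a GraphBLAS engine apply sparse-matrix optimizations to GraphRAG-style multi-hop queries.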