# MicroClaw
A Rust-based AI assistant with a channel-agnostic agent loop at its center. One core loop drives behavior across all channels—Telegram, Discord, Slack, Feishu, Web, and more—so you get consistent agent logic without channel-specific forks.
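The channel-agnostic design can be sketched as a single trait boundary: each channel implements one adapter interface, and one loop drives them all. This is an illustrative sketch under assumed names (`ChannelAdapter`, `Inbound`, `Outbound`, `run_once`), not MicroClaw's actual API:

```rust
/// A message normalized from any channel into one common shape.
#[derive(Debug, Clone, PartialEq)]
pub struct Inbound {
    pub channel: String,
    pub chat_id: String,
    pub text: String,
}

/// What the agent decided to send back.
#[derive(Debug, Clone, PartialEq)]
pub struct Outbound {
    pub chat_id: String,
    pub text: String,
}

/// Ingress/egress boundary: the only channel-specific code.
pub trait ChannelAdapter {
    fn poll(&mut self) -> Option<Inbound>;
    fn send(&mut self, msg: Outbound);
}

/// The single agent loop; channel identity never leaks into the logic.
/// (In the real system this is where LLM calls, tools, and memory
/// injection would happen; here the "agent" just echoes.)
pub fn run_once(adapter: &mut dyn ChannelAdapter) {
    while let Some(inbound) = adapter.poll() {
        let reply = Outbound {
            chat_id: inbound.chat_id,
            text: format!("agent saw: {}", inbound.text),
        };
        adapter.send(reply);
    }
}

/// A trivial in-memory adapter used to exercise the loop.
pub struct VecAdapter {
    pub queue: Vec<Inbound>,
    pub sent: Vec<Outbound>,
}

impl ChannelAdapter for VecAdapter {
    fn poll(&mut self) -> Option<Inbound> {
        if self.queue.is_empty() { None } else { Some(self.queue.remove(0)) }
    }
    fn send(&mut self, msg: Outbound) {
        self.sent.push(msg);
    }
}
```

Because the loop only sees `dyn ChannelAdapter`, adding a new channel means writing one adapter, never forking the agent logic.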
## Key Facts
- Language: Rust
- Architecture: One channel-agnostic agent loop; one provider-agnostic LLM layer; channel adapters for ingress/egress
- Channels: Telegram, Discord, Slack, Feishu/Lark, Web, Matrix, WhatsApp, iMessage, Email, Nostr, Signal, DingTalk, QQ, IRC
- Tools: Shell, file, search, web; scheduling; chat export; sub-agent delegation; skills + MCP federation
- Memory: File memory (AGENTS.md) at global and chat scopes; structured SQLite; reflector extraction; dedupe/supersede lifecycle; observability endpoints
- LLM providers: Anthropic, OpenAI, OpenAI-compatible APIs
- License: MIT
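The provider-agnostic LLM layer from the facts above can likewise be sketched as one trait with interchangeable backends. The type and function names here (`LlmProvider`, `provider_from_config`, the stub structs) are assumptions for illustration; real backends would make HTTP calls to Anthropic, OpenAI, or an OpenAI-compatible endpoint:

```rust
/// One interface the agent loop talks to, regardless of backend.
pub trait LlmProvider {
    fn name(&self) -> &str;
    fn complete(&self, prompt: &str) -> String;
}

/// Stub standing in for an HTTP-backed Anthropic client.
pub struct AnthropicStub;
impl LlmProvider for AnthropicStub {
    fn name(&self) -> &str { "anthropic" }
    fn complete(&self, prompt: &str) -> String {
        format!("[anthropic] {prompt}")
    }
}

/// Stub standing in for any OpenAI-compatible endpoint.
pub struct OpenAiCompatStub;
impl LlmProvider for OpenAiCompatStub {
    fn name(&self) -> &str { "openai-compatible" }
    fn complete(&self, prompt: &str) -> String {
        format!("[openai] {prompt}")
    }
}

/// Config-driven selection: the agent loop only ever sees the trait object.
pub fn provider_from_config(kind: &str) -> Option<Box<dyn LlmProvider>> {
    match kind {
        "anthropic" => Some(Box::new(AnthropicStub)),
        "openai" | "openai-compatible" => Some(Box::new(OpenAiCompatStub)),
        _ => None,
    }
}
```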
## Best For
Chat-native agent workflows where memory quality and lifecycle matter. Teams that want one agent loop driving multiple channels with shared governance for tools, memory injection, and session resume. The memory pipeline—with quality gates, deduplication, and observability—is designed for long-lived sessions.
The codebase stays approachable, with a single, clear runtime path that helps smaller teams ship quickly and keep incident triage short.
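The dedupe/supersede lifecycle described above can be illustrated with a minimal in-memory store: empty extractions are rejected at the quality gate, exact repeats are deduplicated, and a changed value supersedes (and archives) the old one. All names here (`MemoryStore`, `ingest`, the `Ingest` outcomes) are hypothetical; MicroClaw's actual pipeline is SQLite-backed:

```rust
use std::collections::HashMap;

#[derive(Debug, Clone, PartialEq)]
pub enum Status { Superseded }

#[derive(Debug, Clone, PartialEq)]
pub struct Memory {
    pub key: String,
    pub value: String,
    pub status: Status,
}

#[derive(Default)]
pub struct MemoryStore {
    active: HashMap<String, String>,
    archive: Vec<Memory>, // superseded versions kept for observability
}

pub enum Ingest { Inserted, Deduped, Superseded }

impl MemoryStore {
    /// Quality gate + lifecycle: skip empties, dedupe exact repeats,
    /// supersede when a key's value changes.
    pub fn ingest(&mut self, key: &str, value: &str) -> Option<Ingest> {
        if value.trim().is_empty() {
            return None; // rejected by the quality gate
        }
        let existing = self.active.get(key).cloned();
        match existing {
            Some(v) if v == value => Some(Ingest::Deduped),
            Some(v) => {
                self.archive.push(Memory {
                    key: key.into(),
                    value: v,
                    status: Status::Superseded,
                });
                self.active.insert(key.into(), value.into());
                Some(Ingest::Superseded)
            }
            None => {
                self.active.insert(key.into(), value.into());
                Some(Ingest::Inserted)
            }
        }
    }

    pub fn get(&self, key: &str) -> Option<&String> {
        self.active.get(key)
    }

    pub fn superseded_count(&self) -> usize {
        self.archive.len()
    }
}
```

Keeping superseded entries in an archive rather than deleting them is one way to make long-lived sessions auditable: observability endpoints can report how often memories churn.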
## Compared to OpenClaw
MicroClaw standardizes agent behavior first, then plugs in channels. That yields high behavioral consistency and lower maintenance overhead for shared logic. OpenClaw's integration model puts more responsibility on the user; MicroClaw's channel-agnostic loop reduces that burden for chat-centric use cases. Memory lifecycle governance and quality visibility are first-class concerns, not afterthoughts.