lit-mux: Enterprise Multi-Agent Orchestration Platform#
lit-mux is a backend platform for deploying enterprise AI capabilities. It provides a robust API for multi-agent orchestration, session management, and LLM integration—enabling organizations to build custom frontends on a production-ready foundation.
Platform Architecture#
┌─────────────────────────────────────────────────────────────────┐
│ Custom Frontend │
│ (Your Application / Reference UI) │
└─────────────────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ lit-mux API │
│ ┌──────────┐ ┌──────────┐ ┌──────────┐ ┌──────────────────┐ │
│ │ Sessions │ │ Agents │ │ Tools │ │ Organizations │ │
│ └──────────┘ └──────────┘ └──────────┘ └──────────────────┘ │
│ ┌──────────┐ ┌──────────┐ ┌──────────┐ ┌──────────────────┐ │
│ │ Streams │ │ Messages │ │ Terminal │ │ System Prompts │ │
│ └──────────┘ └──────────┘ └──────────┘ └──────────────────┘ │
└─────────────────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ LLM Backends │
│ ┌──────────┐ ┌──────────┐ ┌──────────┐ ┌──────────────────┐ │
│ │ Claude │ │ Gemini │ │ Codex │ │ Ollama (local) │ │
│ └──────────┘ └──────────┘ └──────────┘ └──────────────────┘ │
│ ┌──────────┐ ┌──────────┐ │
│ │Perplexity│ │Claude CLI│ │
│ └──────────┘ └──────────┘ │
└─────────────────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ MCP Tools │
│ Jira • Git • Filesystem • Mattermost • Custom Tools │
└─────────────────────────────────────────────────────────────────┘
Core Capabilities#
Multi-Agent System#
Deploy multiple AI agents with distinct roles, backends, and configurations:
- Agent Registry: Manage agent configurations, status, and metadata
- Agent-to-Agent Messaging: Agents communicate via typed message queues
- Agent Hierarchy: Parent-child relationships for organizational structure
- Per-Agent MCP Servers: Each agent can have its own toolset
- Heartbeat System: Autonomous agents that wake periodically to process stimuli
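As a sketch, an agent-to-agent message is a typed payload posted to the messaging endpoint (`POST /messages/agent`). The field names below are assumptions for illustration, not the documented schema:

```python
import json

def build_agent_message(sender_id: str, recipient_id: str,
                        message_type: str, content: str) -> str:
    """Build a JSON body for POST /messages/agent.

    Field names here are illustrative -- check your lit-mux
    deployment's schema for the exact payload shape.
    """
    payload = {
        "from_agent": sender_id,
        "to_agent": recipient_id,
        "type": message_type,  # typed queues: e.g. "task", "status"
        "content": content,
    }
    return json.dumps(payload)

body = build_agent_message("planner", "coder", "task",
                           "Implement the login page")
print(json.loads(body)["to_agent"])  # → coder
```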
Session Management#
Robust conversation persistence and management:
- Session CRUD: Create, read, update, delete sessions via REST API
- Pagination: Load messages with offset/limit for large conversations
- Session Search: Full-text search across conversation history
- Session Metadata: Store custom data (username, backend preferences, etc.)
- Conversation Compaction: Summarize long conversations to manage context
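The offset/limit pagination above can be wrapped in a small client helper. The `fetch_page` callable stands in for your HTTP client (an assumption, so the example stays transport-agnostic); only the paging logic is shown:

```python
def iter_messages(fetch_page, session_id, limit=100):
    """Page through GET /sessions/{id}/messages using offset/limit.

    `fetch_page(session_id, offset, limit)` is caller-supplied and
    returns one page of messages. Iteration stops at the first
    short page.
    """
    offset = 0
    while True:
        page = fetch_page(session_id, offset, limit)
        yield from page
        if len(page) < limit:
            break
        offset += limit

# Usage with a fake fetcher standing in for the HTTP call:
messages = list(range(250))
def fake_fetch(session_id, offset, limit):
    return messages[offset:offset + limit]

collected = list(iter_messages(fake_fetch, "sess-1", limit=100))
print(len(collected))  # → 250
```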
Multiple LLM Backends#
Switch between providers without changing your application:
| Backend | Description | Use Case |
|---|---|---|
| Claude CLI | Anthropic via Claude Code CLI | Full tool use, streaming |
| Claude API | Direct Anthropic API | Lower latency, no CLI dependency |
| Gemini | Google's Gemini models | Alternative provider |
| Codex | OpenAI Codex models | Code generation |
| Ollama | Local open-source models | Privacy, no API costs |
| Perplexity | Web-search augmented | Research, current events |
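Because the backend is selected per request, swapping providers is a one-field change. A hedged sketch, assuming a `backend` field in the stream request body (verify the exact field name against your deployment):

```python
def stream_request_body(prompt: str, backend: str) -> dict:
    """Request body for POST /sessions/{id}/stream.

    Only the backend value changes when you swap providers;
    the rest of the client code stays the same.
    """
    return {"message": prompt, "backend": backend}

# Same call shape whether you target a hosted or a local model:
cloud = stream_request_body("Summarise the sprint", "claude-api")
local = stream_request_body("Summarise the sprint", "ollama")
print(cloud["backend"], local["backend"])  # → claude-api ollama
```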
MCP Tool Integration#
Model Context Protocol for extending AI capabilities:
- Built-in Servers: Jira, Git, filesystem access, agent messaging
- Custom Servers: Add your own MCP servers for business integrations
- Per-Agent Configuration: Different agents can have different tools
- Tool Discovery: API endpoints to list available tools
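One way to picture per-agent configuration is a mapping from agent to allowed MCP servers. The schema below is hypothetical; the server names mirror the built-ins listed above:

```python
# Hypothetical per-agent toolset configuration. Each agent lists the
# MCP servers it may use; the actual config format is an assumption.
AGENT_TOOLING = {
    "release-manager": ["jira", "git", "agent-messaging"],
    "ops-assistant":   ["filesystem", "git"],
}

def tools_for(agent_id: str) -> list:
    """Resolve an agent's toolset, defaulting to no tools."""
    return AGENT_TOOLING.get(agent_id, [])

print(tools_for("ops-assistant"))  # → ['filesystem', 'git']
print(tools_for("unknown-agent"))  # → []
```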
Real-Time Streaming#
Multiple streaming patterns for responsive UIs:
- Server-Sent Events (SSE): Standard streaming for web clients
- WebSocket: Bidirectional streaming for terminals, telemetry
- Stream Reconnection: Resume interrupted streams by ID
- Stream Cancellation: Cancel in-progress generation
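A minimal SSE consumer only needs to collect `data:` fields per the standard event-stream format; a production client would also track event `id`s to support the stream reconnection described above:

```python
def parse_sse(chunk: str):
    """Minimal parser for the Server-Sent Events wire format.

    Collects `data:` fields per event (events are separated by a
    blank line) -- enough to consume text from
    POST /sessions/{id}/stream.
    """
    events, data = [], []
    for line in chunk.splitlines():
        if line.startswith("data:"):
            data.append(line[5:].lstrip())
        elif line == "" and data:
            events.append("\n".join(data))
            data = []
    if data:  # flush a trailing event with no final blank line
        events.append("\n".join(data))
    return events

stream = "data: Hello\n\ndata: world\ndata: again\n\n"
print(parse_sse(stream))  # → ['Hello', 'world\nagain']
```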
Terminal Integration#
Remote terminal access for development and operations:
- WebSocket Terminal: Full PTY sessions via WebSocket
- Buffer Reading: Access terminal screen content programmatically
- Session Management: Create, list, and manage terminal sessions
API Overview#
The lit-mux API is organized around these resource types:
Sessions#
POST /sessions Create session
GET /sessions List sessions
GET /sessions/{id} Get session
PATCH /sessions/{id} Update session
DELETE /sessions/{id} Delete session
GET /sessions/{id}/messages Get messages (paginated)
POST /sessions/{id}/messages Add message
POST /sessions/{id}/stream Stream AI response
POST /sessions/{id}/compact Compact conversation
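As an illustrative sketch, creating a session is a single JSON POST. The metadata payload shape is an assumption (the doc only says sessions can carry custom data such as username or backend preferences); the helper builds the request without sending it:

```python
import json
from urllib import request

def create_session_request(base_url: str, metadata: dict) -> request.Request:
    """Build (but do not send) the POST /sessions request.

    Send with urllib.request.urlopen(req), or translate to your
    HTTP client of choice.
    """
    body = json.dumps({"metadata": metadata}).encode()
    return request.Request(
        base_url + "/sessions",
        data=body,
        method="POST",
        headers={"Content-Type": "application/json"},
    )

req = create_session_request("https://litmux.example.com",
                             {"username": "ada", "backend": "ollama"})
print(req.method, req.full_url)  # → POST https://litmux.example.com/sessions
```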
Agents#
POST /agents Create agent
GET /agents List agents
GET /agents/{id} Get agent
DELETE /agents/{id} Delete agent
GET /agents/{id}/sessions Get agent's sessions
POST /agents/{id}/stream Stream to agent
GET /agents/{id}/heartbeat/status Heartbeat status
Messages#
POST /messages/agent Send agent-to-agent message
GET /messages/{id} Get message status
POST /messages/agent-to-user Send message to user
Tools & Backends#
GET /backends List available backends
GET /models List available models
GET /tools List MCP tools
GET /mcp/servers List MCP servers
GET /mcp/health MCP health check
System#
GET /health Health check
GET /system-prompt Get system prompt
GET /system-prompt/history Prompt version history
WebSocket Endpoints#
/ws/terminal/{id} Terminal session
/ws/telemetry Real-time telemetry
/ws/heartbeat Heartbeat notifications
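A small helper can centralize these WebSocket paths; the base URL below is a placeholder for your own deployment:

```python
def ws_url(base, endpoint, session_id=None):
    """Build a WebSocket URL for the endpoints listed above."""
    paths = {
        "terminal": "/ws/terminal/{id}",
        "telemetry": "/ws/telemetry",
        "heartbeat": "/ws/heartbeat",
    }
    path = paths[endpoint]
    if "{id}" in path:
        if session_id is None:
            raise ValueError("terminal endpoint requires a session id")
        path = path.format(id=session_id)
    return base + path

print(ws_url("wss://litmux.example.com", "terminal", "t-42"))
# → wss://litmux.example.com/ws/terminal/t-42
```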
Enterprise Security#
Keycloak Integration#
Native enterprise identity management:
- Single Sign-On (SSO): Employees use existing corporate credentials
- Federated Authentication: Active Directory, LDAP, SAML, OAuth
- Execute as User: AI operations run with the user's actual permissions
- Complete Audit Trails: All actions logged with user identity
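Authenticated clients attach the Keycloak-issued token as a standard OAuth 2.0 bearer header on every request, for example:

```python
def with_auth(headers: dict, token: str) -> dict:
    """Attach a Keycloak-issued bearer token to request headers.

    The header name follows the OAuth 2.0 bearer convention; verify
    against your gateway's configuration.
    """
    return {**headers, "Authorization": f"Bearer {token}"}

h = with_auth({"Content-Type": "application/json"}, "example-token")
print(h["Authorization"])  # → Bearer example-token
```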
Zero Trust Architecture#
- Every request authenticated
- Permission verification per operation
- Session management with timeouts
- Multi-factor authentication support
Reference Implementation#
lit-mux includes a reference web UI that demonstrates the platform's capabilities. This is not a product—it's architecture documentation in the form of working code.
What the reference UI demonstrates:
- Session management with temporal grouping
- Real-time streaming with SSE
- Voice input via Web Speech API
- Push-to-talk mobile interaction
- Photo upload and display
- Multi-backend switching
- Agent selection
- Search and navigation
Your frontend can be completely different. The reference UI is one possible implementation. Organizations typically build:
- Custom chat interfaces matching their brand
- Specialized workflow applications
- Mobile apps
- CLI tools
- Integration into existing applications
See Reference UI Architecture for implementation details.
Documentation#
- Reference UI Architecture - How the reference frontend works
- MCP Tools - Extending AI with custom tools
- Self-Hosted Models - Running local LLMs with Ollama
- Prompt Engineering - Configuring system prompts
- Workflow Canvas - Visual multi-agent orchestration
Why lit-mux?#
For Platform Teams#
- One backend, many frontends: Build the API once, deploy everywhere
- Backend flexibility: Switch LLM providers without changing client code
- Enterprise-ready: Authentication, audit trails, session management
For Developers#
- Clean API: RESTful design with comprehensive endpoints
- Real-time first: Streaming baked in from the start
- Tool extensibility: MCP protocol for custom integrations
For Organizations#
- On-premises deployment: Your data stays on your infrastructure
- Multi-tenant ready: Organizations, agents, and sessions are isolated
- Audit compliance: Complete logging for regulatory requirements