Architecture
The 123AIChat architecture is optimized for internal reliability, deterministic delivery boundaries, and practical operational control as AI collaboration moves from pilot to production.
Runtime Layer
Rust + Axum + Tokio handles concurrent connections and keeps WebSocket task failures isolated from the core service loop.
Operational outcome: Stable messaging service even when individual client sessions fail.
Persistence Layer
Embedded SurrealDB (RocksDB) persists users, channels, and messages with support for full-text and semantic retrieval.
Operational outcome: Durable collaboration history and replay-friendly audit trails.
AI Job Layer
Ollama-backed jobs run under semaphore governance to prevent runaway concurrency from degrading core chat responsiveness.
Operational outcome: Predictable model throughput without blocking collaboration traffic.
Deployment Layer
Single-binary deployment with embedded frontend assets simplifies LAN and on-prem rollouts across environments.
Operational outcome: Faster deployment and fewer moving parts in operations.
Failure Behavior
WebSocket task failure: Only that connection task exits; channel service continues for the remaining participants.
AI job queue saturation: Queue state degrades gracefully while base communication remains available.
Routing misconfiguration: The core messaging path remains functional, allowing operators to correct routing without a total outage.