diff --git a/.story_kit/work/1_upcoming/174_story_matrix_chatbot_interface_for_story_kit.md b/.story_kit/work/1_upcoming/174_story_matrix_chatbot_interface_for_story_kit.md
index 3330fa8..c27a9cd 100644
--- a/.story_kit/work/1_upcoming/174_story_matrix_chatbot_interface_for_story_kit.md
+++ b/.story_kit/work/1_upcoming/174_story_matrix_chatbot_interface_for_story_kit.md
@@ -6,7 +6,7 @@ name: Matrix Chatbot Interface for Story Kit
 
 ## User Story
 
-As a developer, I want to interact with Story Kit through a Matrix chat room so that I can create stories, assign agents, and monitor pipeline progress from any Matrix client (Element, Element X, mobile) without needing the web UI open.
+As a developer, I want to interact with Story Kit through a Matrix chat room so that I can create stories, assign agents, and monitor pipeline progress conversationally from any Matrix client (Element, Element X, mobile) without needing the web UI open.
 
 ## Background
 
@@ -26,41 +26,46 @@ Matrix is the right platform because:
 
 ## Architecture
 
 ```
-Matrix (Conduit) <-> Story Kit Server (matrix-sdk built in)
-                              |
-                 AppContext, AgentPool, watcher_tx (direct access)
+Matrix Room
+    |
+    v
+Story Kit Server
+  |-- matrix module (matrix-sdk) -- receives messages, posts updates
+  |-- LLM (Anthropic API) -------- interprets intent, decides actions
+  |-- MCP tools ------------------- create_story, start_agent, list_agents, etc.
+  |-- watcher_tx ------------------ pipeline events pushed to room
 ```
 
-The Matrix bot is built into the server process as a module (`server/src/matrix/`). It:
+The bot is an LLM agent with access to Story Kit's MCP tools, using Matrix as its transport. Users talk naturally — "we need a dark mode feature", "what's stuck?", "put a coder on 42" — and the LLM interprets intent and calls the appropriate tools. No special command syntax needed.
+
+The Matrix module is built into the server process (`server/src/matrix/`). It:
 1. Connects to the Matrix homeserver as a bot user on server startup
 2. Joins configured room(s)
-3. Listens for messages (commands)
-4. Calls internal functions directly (no HTTP round-trip to itself)
-5. Subscribes to `watcher_tx` broadcast channel for live pipeline updates
-6. Posts updates back to the room
+3. Passes incoming messages to an LLM with Story Kit MCP tools available
+4. Posts LLM responses back to the room
+5. Subscribes to `watcher_tx` broadcast channel and posts live pipeline updates
 
 Benefits of building it in:
-- Direct access to `AppContext`, `AgentPool`, pipeline state — no self-referential HTTP calls
+- Direct access to `AppContext`, `AgentPool`, pipeline state
 - Subscribes to existing broadcast channels (`watcher_tx`, `reconciliation_tx`) for live events
-- Single process to manage — no "where is the server?" configuration
-- Can restart cleanly with the server
+- Single process to manage
+- MCP tools already exist — the LLM uses the same tools that CLI agents use
 
 ## Acceptance Criteria
 
-### Phase 1: Core Bot Infrastructure
+### Phase 1: Matrix Connection
 - [ ] New `server/src/matrix/` module that connects to a Matrix homeserver using `matrix-sdk`
 - [ ] Bot reads configuration from `.story_kit/bot.toml` (homeserver URL, bot user credentials, room ID(s))
 - [ ] Bot connection is optional — server starts normally if `bot.toml` is missing or Matrix is disabled
 - [ ] Bot joins configured room(s) on startup
-- [ ] Bot responds to a `!status` command with current pipeline state (counts per stage)
-- [ ] Bot responds to `!pipeline` with a formatted list of all stories across all stages
+- [ ] Bot ignores its own messages (no echo loops)
 
-### Phase 2: Story Management
-- [ ] `!create story
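The acceptance criteria name a `.story_kit/bot.toml` file with homeserver URL, bot credentials, room ID(s), and an enable/disable switch, but the diff does not show its shape. A minimal sketch — the key names here are assumptions, not the project's actual schema:

```toml
# Hypothetical .story_kit/bot.toml layout. Only the required fields
# (homeserver URL, bot user credentials, room IDs, disable switch)
# come from the acceptance criteria; key names are illustrative.
[matrix]
enabled = true
homeserver_url = "https://matrix.example.org"
bot_user_id = "@storybot:example.org"
bot_password = "change-me"            # or an access token / session store
room_ids = ["!abc123:example.org"]
```

Since the criteria say the server must start normally when this file is missing, the loader would treat a read failure as "Matrix disabled" rather than an error.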
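Two of the behaviors above — ignoring the bot's own messages (no echo loops) and relaying `watcher_tx` pipeline events into the room — can be sketched in plain Rust. The `PipelineEvent` type and function names below are illustrative assumptions, not the server crate's actual API:

```rust
// Illustrative sketch only: `PipelineEvent` and these helpers are
// hypothetical stand-ins for whatever `watcher_tx` actually broadcasts.

#[derive(Debug)]
struct PipelineEvent {
    story_id: u32,
    stage: String, // e.g. the pipeline directory a story moved into
}

/// Skip the bot's own messages so posted updates don't loop back
/// through the handler (the "no echo loops" acceptance criterion).
fn should_handle(sender: &str, bot_user_id: &str) -> bool {
    sender != bot_user_id
}

/// Format a pipeline event as a plain-text room message.
fn format_event(ev: &PipelineEvent) -> String {
    format!("story #{} moved to {}", ev.story_id, ev.stage)
}

fn main() {
    // The bot's own messages are filtered out; everyone else's are handled.
    assert!(!should_handle("@storybot:example.org", "@storybot:example.org"));
    assert!(should_handle("@dev:example.org", "@storybot:example.org"));

    let ev = PipelineEvent { story_id: 42, stage: "2_in_progress".into() };
    println!("{}", format_event(&ev)); // story #42 moved to 2_in_progress
}
```

In the real module the event would arrive on a `watcher_tx` broadcast receiver and the formatted string would be posted to the room via `matrix-sdk`; the filter would compare the event sender against the logged-in bot user ID.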