story-kit: create 174_story_matrix_chatbot_interface_for_story_kit

This commit is contained in:
Dave
2026-02-25 11:49:26 +00:00
parent dfd6ef2109
commit ed7eebc236


@@ -1,21 +1,16 @@
 ---
-name: Matrix Chatbot Interface for Story Kit
+name: Matrix Bot with LLM Conversation
 ---
-# Matrix Chatbot Interface for Story Kit
+# Matrix Bot with LLM Conversation
 ## User Story
-As a developer, I want to interact with Story Kit through a Matrix chat room so that I can create stories, assign agents, and monitor pipeline progress conversationally from any Matrix client (Element, Element X, mobile) without needing the web UI open.
+As a developer, I want to talk to Story Kit through a Matrix chat room so that I can create stories, assign agents, and manage the pipeline conversationally from any Matrix client (Element, Element X, mobile).
 ## Background
-Story Kit currently requires the web UI or direct file manipulation to manage the pipeline. A Matrix bot built into the server bridges the gap between the existing internals and a conversational interface, enabling:
-- Mobile access (manage agents from your phone)
-- Group collaboration (multiple people in a room managing work together)
-- Social coding (see agent activity as chat messages, discuss stories in-thread)
-- Future distributed computing (multiple Story Kit nodes coordinated via Matrix)
+Story Kit currently requires the web UI or direct file manipulation to manage the pipeline. A Matrix bot built into the server provides a conversational interface powered by an LLM with access to Story Kit's MCP tools. Users talk naturally — "we need a dark mode feature", "what's stuck?", "put a coder on 42" — and the LLM interprets intent and calls the appropriate tools.
 Matrix is the right platform because:
 - Self-hosted (Conduit already running)
@@ -30,56 +25,41 @@ Matrix Room
 |
 v
 Story Kit Server
-|-- matrix module (matrix-sdk) -- receives messages, posts updates
+|-- matrix module (matrix-sdk) -- receives messages, posts responses
 |-- LLM (Anthropic API) -------- interprets intent, decides actions
 |-- MCP tools ------------------- create_story, start_agent, list_agents, etc.
-|-- watcher_tx ------------------ pipeline events pushed to room
 ```
-The bot is an LLM agent with access to Story Kit's MCP tools, using Matrix as its transport. Users talk naturally — "we need a dark mode feature", "what's stuck?", "put a coder on 42" — and the LLM interprets intent and calls the appropriate tools. No special command syntax needed.
 The Matrix module is built into the server process (`server/src/matrix/`). It:
 1. Connects to the Matrix homeserver as a bot user on server startup
-2. Joins configured room(s)
+2. Joins a configured room
 3. Passes incoming messages to an LLM with Story Kit MCP tools available
 4. Posts LLM responses back to the room
-5. Subscribes to `watcher_tx` broadcast channel and posts live pipeline updates
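The per-message flow above can be sketched as follows. All type and function names here are illustrative stand-ins, not Story Kit's actual API: the real module would receive events via matrix-sdk handlers and reuse the server's existing Anthropic client.

```rust
// Sketch only: illustrative names, not Story Kit's real types.
struct IncomingMessage {
    sender: String,
    body: String,
}

// Stand-in for the LLM + MCP tool round-trip.
fn run_llm(prompt: &str) -> String {
    format!("(reply to: {prompt})")
}

/// Returns the reply to post back to the room, or None when the
/// message should be ignored (the bot's own messages, avoiding echo loops).
fn handle_message(msg: &IncomingMessage, bot_user_id: &str) -> Option<String> {
    if msg.sender == bot_user_id {
        return None; // never answer ourselves
    }
    Some(run_llm(&msg.body))
}
```

The sender check is the whole echo-loop defense: since the bot posts as a normal room member, every reply it sends comes back through sync and must be filtered out before reaching the LLM.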
 Benefits of building it in:
 - Direct access to `AppContext`, `AgentPool`, pipeline state
-- Subscribes to existing broadcast channels (`watcher_tx`, `reconciliation_tx`) for live events
 - Single process to manage
 - MCP tools already exist — the LLM uses the same tools that CLI agents use
 ## Acceptance Criteria
-### Phase 1: Matrix Connection
 - [ ] New `server/src/matrix/` module that connects to a Matrix homeserver using `matrix-sdk`
-- [ ] Bot reads configuration from `.story_kit/bot.toml` (homeserver URL, bot user credentials, room ID(s))
+- [ ] Bot reads configuration from `.story_kit/bot.toml` (homeserver URL, bot user credentials, room ID)
 - [ ] Bot connection is optional — server starts normally if `bot.toml` is missing or Matrix is disabled
-- [ ] Bot joins configured room(s) on startup
+- [ ] Bot joins configured room on startup
 - [ ] Bot ignores its own messages (no echo loops)
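A minimal sketch of what `.story_kit/bot.toml` might contain. The key names below are assumptions for illustration; this story does not fix a schema beyond "homeserver URL, bot user credentials, room ID":

```toml
# Illustrative only: key names are not a confirmed schema.
enabled = true
homeserver = "https://matrix.example.org"
username = "@storykit:example.org"
password = "change-me"          # or an access token
room_id = "!abc123:example.org"
```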
-### Phase 2: LLM-Powered Conversation
 - [ ] Incoming room messages are passed to an LLM (Anthropic API) with Story Kit MCP tools
 - [ ] The LLM can call MCP tools to answer questions and take actions (create stories, assign agents, check pipeline status, etc.)
 - [ ] LLM responses are posted back to the room as the bot user
-- [ ] Conversation context is maintained per-room (the bot remembers recent messages)
-- [ ] Bot handles multiple rooms independently
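The "LLM can call MCP tools" criterion implies a tool-use loop: the model either returns text to post or asks for a tool call, whose result is fed back until text comes out. A minimal sketch with stub types standing in for the Anthropic API and the MCP dispatch (all names hypothetical):

```rust
// Stub of one LLM turn: either a tool request or final text.
enum LlmTurn {
    ToolCall { name: String, args: String },
    Text(String),
}

// Stub: the real call goes through the server's Anthropic client
// with the MCP tool schemas attached.
fn call_llm(transcript: &[String]) -> LlmTurn {
    if transcript.iter().any(|m| m.starts_with("tool:")) {
        LlmTurn::Text("2 stories in progress".to_string())
    } else {
        LlmTurn::ToolCall { name: "list_agents".to_string(), args: "{}".to_string() }
    }
}

// Stub: the real dispatch hits the same handlers behind POST /mcp.
fn run_tool(name: &str, _args: &str) -> String {
    format!("tool:{name}:ok")
}

/// Drive the LLM until it produces text to post back to the room.
fn converse(user_msg: &str) -> String {
    let mut transcript = vec![format!("user:{user_msg}")];
    loop {
        match call_llm(&transcript) {
            LlmTurn::ToolCall { name, args } => transcript.push(run_tool(&name, &args)),
            LlmTurn::Text(reply) => return reply,
        }
    }
}
```

A production loop would also cap iterations and surface tool errors to the room rather than looping forever.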
-### Phase 3: Live Updates
-- [ ] Bot subscribes to `watcher_tx` broadcast channel and posts pipeline changes to the room
-- [ ] Agent state changes (started, completed, failed) appear as room messages
-- [ ] Stories moving between pipeline stages generate notifications
-- [ ] Messages are concise and formatted for readability (not noisy)
 ## Out of Scope
-- E2EE (can be added later, start with unencrypted room)
-- Multi-project support (single project per bot instance)
-- Distributed multi-node coordination (future story)
+- Conversation context / message history (see story 176)
+- Live pipeline update feed (see story 175)
+- Multi-room support (see story 176)
+- E2EE (can be added later)
+- Distributed multi-node coordination
 - Web UI changes
 - Permission/auth model for who can run commands
-- Voice messages or media handling
 ## Technical Notes
 - Use `matrix-sdk` crate for Matrix client
@@ -89,10 +69,3 @@ Benefits of building it in:
 - Bot spawns as a `tokio::spawn` task from `main.rs`, similar to the watcher and reaper tasks
 - LLM calls use the same Anthropic API path the server already uses for the web UI chat
 - MCP tool definitions are already registered at `POST /mcp` — the LLM can use the same tool schemas
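The "server starts normally if `bot.toml` is missing" rule reduces to a startup check like the following sketch (function name and return shape are illustrative; TOML parsing is elided):

```rust
use std::fs;
use std::path::Path;

/// Returns the raw bot.toml contents, or None when no bot is
/// configured, in which case the server skips spawning the bot
/// task and continues normally.
fn load_bot_config(story_kit_dir: &Path) -> Option<String> {
    fs::read_to_string(story_kit_dir.join("bot.toml")).ok()
}
```

In `main.rs` this would gate the `tokio::spawn` of the bot task, mirroring how the watcher and reaper tasks are launched.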
-## Future Considerations
-- Thread-based story discussions (Matrix threads per story)
-- Code review in-chat (show diffs, approve/reject)
-- Distributed mode: a separate coordinator bot that sits above multiple Story Kit nodes, farms out work based on node capacity, and aggregates status. Each node keeps its built-in bot as a local control interface the coordinator talks to.
-- Bridge to Signal/WhatsApp via Matrix bridges
-- Bot personality/tone configuration