story-kit: create 174_story_matrix_chatbot_interface_for_story_kit

This commit is contained in:
Dave
2026-02-25 11:47:58 +00:00
parent 7ce2935ef1
commit dfd6ef2109


## User Story
As a developer, I want to interact with Story Kit through a Matrix chat room so that I can create stories, assign agents, and monitor pipeline progress conversationally from any Matrix client (Element, Element X, mobile) without needing the web UI open.
## Background
## Architecture
```
Matrix Room
|
v
Story Kit Server
|-- matrix module (matrix-sdk) -- receives messages, posts updates
|-- LLM (Anthropic API) -------- interprets intent, decides actions
|-- MCP tools ------------------- create_story, start_agent, list_agents, etc.
|-- watcher_tx ------------------ pipeline events pushed to room
```
The bot is an LLM agent with access to Story Kit's MCP tools, using Matrix as its transport. Users talk naturally — "we need a dark mode feature", "what's stuck?", "put a coder on 42" — and the LLM interprets intent and calls the appropriate tools. No special command syntax needed.
The Matrix module is built into the server process (`server/src/matrix/`). It:
1. Connects to the Matrix homeserver as a bot user on server startup
2. Joins configured room(s)
3. Passes incoming messages to an LLM with Story Kit MCP tools available
4. Posts LLM responses back to the room
5. Subscribes to `watcher_tx` broadcast channel and posts live pipeline updates
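The startup sequence above might look roughly like the following sketch (not a working implementation: `BotConfig` and `on_room_message` are assumed names, error handling is elided, and matrix-sdk's API details vary by version):

```rust
use matrix_sdk::{config::SyncSettings, ruma::RoomId, Client};

// `BotConfig` is a hypothetical struct deserialized from .story_kit/bot.toml.
pub async fn run(cfg: BotConfig) -> anyhow::Result<()> {
    // 1. Connect to the homeserver and log in as the bot user.
    let client = Client::builder()
        .homeserver_url(&cfg.homeserver_url)
        .build()
        .await?;
    client
        .matrix_auth()
        .login_username(&cfg.user_id, &cfg.password)
        .send()
        .await?;

    // 2. Join the configured room(s).
    for room in &cfg.room_ids {
        let room_id = RoomId::parse(room)?;
        client.join_room_by_id(&room_id).await?;
    }

    // 3-4. Hand incoming messages to the LLM loop (defined elsewhere).
    client.add_event_handler(on_room_message);

    // Run the sync loop until the server shuts down.
    client.sync(SyncSettings::default()).await?;
    Ok(())
}
```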
Benefits of building it in:
- Direct access to `AppContext`, `AgentPool`, pipeline state
- Subscribes to existing broadcast channels (`watcher_tx`, `reconciliation_tx`) for live events
- Single process to manage
- MCP tools already exist — the LLM uses the same tools that CLI agents use
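The interpret-and-act loop could be sketched as follows (pseudocode-level Rust: every helper name here — `anthropic::complete`, `mcp::tool_schemas`, `mcp::invoke`, `LlmReply` — is an assumption, not existing code; the Anthropic tool-use protocol is summarized, not reproduced exactly):

```rust
// Sketch of the per-message loop (all helper names hypothetical).
async fn handle_message(room: &Room, text: &str, ctx: &AppContext) -> Result<()> {
    let mut convo = ctx.history_for(room); // recent room messages
    convo.push_user(text);
    loop {
        // Ask the LLM, advertising the existing MCP tool schemas.
        let reply = anthropic::complete(&convo, mcp::tool_schemas()).await?;
        match reply {
            LlmReply::ToolCall { name, args } => {
                // e.g. create_story, start_agent, list_agents
                let result = mcp::invoke(&name, args, ctx).await?;
                convo.push_tool_result(&name, result);
            }
            LlmReply::Text(answer) => {
                room.send_text(&answer).await?; // post back to the room
                break;
            }
        }
    }
    Ok(())
}
```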
## Acceptance Criteria
### Phase 1: Matrix Connection
- [ ] New `server/src/matrix/` module that connects to a Matrix homeserver using `matrix-sdk`
- [ ] Bot reads configuration from `.story_kit/bot.toml` (homeserver URL, bot user credentials, room ID(s))
- [ ] Bot connection is optional — server starts normally if `bot.toml` is missing or Matrix is disabled
- [ ] Bot joins configured room(s) on startup
- [ ] Bot ignores its own messages (no echo loops)
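A `.story_kit/bot.toml` satisfying the criteria above might look like this (field names are illustrative, not final):

```toml
# .story_kit/bot.toml -- all field names illustrative
enabled = true
homeserver_url = "https://matrix.example.org"
user_id = "@storykit-bot:example.org"
password = "change-me"            # or an access token, depending on login flow
room_ids = ["!abc123:example.org"]
```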
### Phase 2: LLM-Powered Conversation
- [ ] Incoming room messages are passed to an LLM (Anthropic API) with Story Kit MCP tools
- [ ] The LLM can call MCP tools to answer questions and take actions (create stories, assign agents, check pipeline status, etc.)
- [ ] LLM responses are posted back to the room as the bot user
- [ ] Conversation context is maintained per-room (the bot remembers recent messages)
- [ ] Bot handles multiple rooms independently
### Phase 3: Live Updates
- [ ] Bot subscribes to `watcher_tx` broadcast channel and posts pipeline changes to the room
- [ ] Messages are concise and formatted for readability (not noisy)
## Out of Scope
- E2EE (can be added later, start with unencrypted room)
- Multi-project support (single project per bot instance)
- Distributed multi-node coordination (future story)
- Web UI changes
- Permission/auth model for who can run commands
- Voice messages or media handling
## Technical Notes
- Use `matrix-sdk` crate for Matrix client
- Bot receives `Arc<AppContext>` (or relevant sub-fields) at startup to access internals directly
- Configuration in `.story_kit/bot.toml` keeps bot config alongside project config
- Bot spawns as a `tokio::spawn` task from `main.rs`, similar to the watcher and reaper tasks
- LLM calls use the same Anthropic API path the server already uses for the web UI chat
- MCP tool definitions are already registered at `POST /mcp` — the LLM can use the same tool schemas
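Wiring this up in `main.rs` might look like the sketch below (`config::load_bot_toml`, `matrix::run`, and the `watcher_tx` field access are assumed names based on the notes above):

```rust
// In main.rs, next to the watcher and reaper tasks (sketch; assumed names).
if let Some(bot_cfg) = config::load_bot_toml(".story_kit/bot.toml") {
    let ctx = Arc::clone(&app_context);
    // Broadcast channels hand each subscriber its own receiver,
    // so the bot can listen without disturbing other consumers.
    let events = ctx.watcher_tx.subscribe();
    tokio::spawn(async move {
        if let Err(e) = matrix::run(bot_cfg, ctx, events).await {
            tracing::error!("matrix bot exited: {e}");
        }
    });
}
```

Because the bot is optional (Phase 1), a missing or disabled `bot.toml` simply skips the spawn and the server starts as before.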
## Future Considerations
- Thread-based story discussions (Matrix threads per story)
- Code review in-chat (show diffs, approve/reject)
- Distributed mode: a separate coordinator bot that sits above multiple Story Kit nodes, farms out work based on node capacity, and aggregates status. Each node keeps its built-in bot as a local control interface the coordinator talks to.
- Bridge to Signal/WhatsApp via Matrix bridges
- Bot personality/tone configuration