story-kit: start 174_story_matrix_chatbot_interface_for_story_kit

This commit is contained in:
Dave
2026-02-25 12:17:44 +00:00
parent 11ef39b05c
commit dc6e2434e7


---
name: Matrix Bot with LLM Conversation
---
# Matrix Bot with LLM Conversation
## User Story
As a developer, I want to talk to Story Kit through a Matrix chat room so that I can create stories, assign agents, and manage the pipeline conversationally from any Matrix client (Element, Element X, mobile).
## Background
Story Kit currently requires the web UI or direct file manipulation to manage the pipeline. A Matrix bot built into the server would provide a conversational interface powered by an LLM with access to Story Kit's MCP tools. Users talk naturally — "we need a dark mode feature", "what's stuck?", "put a coder on 42" — and the LLM interprets intent and calls the appropriate tools.
Matrix is the right platform because:
- Self-hosted (Conduit already running)
- Proper bot API (appservice or client SDK)
- E2EE support
- Bridges to Signal/WhatsApp come for free
## Architecture
```
Matrix Room
|
v
Story Kit Server
|-- matrix module (matrix-sdk) --- receives messages, posts responses
|-- LLM (Anthropic API) ---------- interprets intent, decides actions
|-- MCP tools -------------------- create_story, start_agent, list_agents, etc.
```
The Matrix module is built into the server process (`server/src/matrix/`). It:
1. Connects to the Matrix homeserver as a bot user on server startup
2. Joins a configured room
3. Passes incoming messages to an LLM with Story Kit MCP tools available
4. Posts LLM responses back to the room
Benefits of building it in:
- Direct access to `AppContext`, `AgentPool`, pipeline state
- Single process to manage
- MCP tools already exist — the LLM uses the same tools that CLI agents use
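Since the LLM reuses the tool schemas already registered at `POST /mcp`, a tool such as `create_story` would be presented to the LLM roughly as below. Only the tool name comes from this story; the parameters and descriptions are illustrative assumptions (the `name`/`description`/`inputSchema` shape is the standard MCP tool definition format):

```json
{
  "name": "create_story",
  "description": "Create a new story in the Story Kit pipeline",
  "inputSchema": {
    "type": "object",
    "properties": {
      "title": { "type": "string", "description": "Short story title (assumed parameter)" },
      "body": { "type": "string", "description": "Markdown story body (assumed parameter)" }
    },
    "required": ["title"]
  }
}
```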
## Acceptance Criteria
- [ ] New `server/src/matrix/` module that connects to a Matrix homeserver using `matrix-sdk`
- [ ] Bot reads configuration from `.story_kit/bot.toml` (homeserver URL, bot user credentials, room ID)
- [ ] Bot connection is optional — server starts normally if `bot.toml` is missing or Matrix is disabled
- [ ] Bot joins configured room on startup
- [ ] Bot ignores its own messages (no echo loops)
- [ ] Incoming room messages are passed to an LLM (Anthropic API) with Story Kit MCP tools
- [ ] The LLM can call MCP tools to answer questions and take actions (create stories, assign agents, check pipeline status, etc.)
- [ ] LLM responses are posted back to the room as the bot user
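Two of the criteria above (ignore own messages, only handle the configured room) reduce to one gating decision before any LLM call. A minimal self-contained sketch of that check — `should_handle` and all IDs are illustrative names, not from the Story Kit codebase:

```rust
/// Decide whether the bot should forward a room message to the LLM.
/// Hypothetical helper for illustration; not Story Kit's actual API.
fn should_handle(sender: &str, bot_user_id: &str, room_id: &str, configured_room: &str) -> bool {
    // Drop the bot's own messages (no echo loops) and any room other
    // than the one configured in bot.toml.
    sender != bot_user_id && room_id == configured_room
}

fn main() {
    let bot = "@storykit:example.org";
    let room = "!pipeline:example.org";
    // The bot's own echo must be dropped.
    assert!(!should_handle(bot, bot, room, room));
    // A user message in the configured room is handled.
    assert!(should_handle("@dave:example.org", bot, room, room));
    // Messages from other rooms are ignored (single-room scope for this story).
    assert!(!should_handle("@dave:example.org", bot, "!other:example.org", room));
    println!("ok");
}
```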
## Out of Scope
- Conversation context / message history (see story 182)
- Live pipeline update feed (see story 181)
- Multi-room support (see story 182)
- E2EE (can be added later)
- Distributed multi-node coordination
- Web UI changes
- Permission/auth model for who can run commands
## Technical Notes
- Use `matrix-sdk` crate for Matrix client
- Module lives at `server/src/matrix/` (mod.rs + submodules as needed)
- Bot receives `Arc<AppContext>` (or relevant sub-fields) at startup to access internals directly
- Configuration in `.story_kit/bot.toml` keeps bot config alongside project config
- Bot spawns as a `tokio::spawn` task from `main.rs`, similar to the watcher and reaper tasks
- LLM calls use the same Anthropic API path the server already uses for the web UI chat
- MCP tool definitions are already registered at `POST /mcp` — the LLM can use the same tool schemas
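Putting the configuration bullets together, `.story_kit/bot.toml` might look like the sketch below. The story only specifies homeserver URL, bot user credentials, and room ID; the table name, field names, and the `enabled` flag are assumptions:

```toml
# .story_kit/bot.toml — hypothetical shape; field names are illustrative.
[matrix]
enabled = true                                # absence of this file also disables the bot
homeserver_url = "https://matrix.example.org"
user_id = "@storykit:example.org"
password = "changeme"                         # bot user credentials
room_id = "!pipeline:example.org"             # the single room this story covers
```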