story-kit: done 245_bug_chat_history_persistence_lost_on_page_refresh_story_145_regression
---
name: "Chat history persistence lost on page refresh (story 145 regression)"
agent: coder-opus
---

## Rejection Notes

**2026-03-16:** Previous coder produced zero code changes — feature branch had no diff against master. The coder must actually use `git bisect` to find the breaking commit and produce a surgical fix. Do not submit with no code changes.

**2026-03-17:** Re-opened. Multiple fix attempts have failed. See investigation notes below for the actual root cause.

# Bug 245: Chat history persistence lost on page refresh (story 145 regression)

## Description

Story 145 implemented localStorage persistence for chat history across page reloads. To reproduce the regression:

1. Open the web UI and have a conversation with the agent
2. Refresh the page (F5 or Cmd+R)
3. Send a new message
4. The LLM has no knowledge of the prior conversation
## Actual Result

Chat history is gone after refresh — the UI shows a blank conversation. Even if messages appear in the UI (loaded from localStorage), the LLM does not receive them as context on the next exchange.
## Expected Result

Chat history is restored from localStorage on page load, as implemented in story 145. The LLM should receive the full conversation history when the user sends a new message after refresh.
## Acceptance Criteria

- [ ] Chat messages survive a full page refresh (visible in UI)
- [ ] Chat messages are restored from localStorage on component mount
- [ ] After refresh, the LLM receives the full prior conversation history as context when the user sends the next message
- [ ] Behaviour matches the original acceptance criteria from story 145
## Investigation Notes (2026-03-17)

### Root cause analysis

The frontend correctly:

1. Persists messages to localStorage in `useChatHistory.ts` (key: `storykit-chat-history:{projectPath}`)
2. Loads them on mount
3. Sends the FULL history array to the backend via `wsRef.current?.sendChat(newHistory, config)` in `Chat.tsx` (line ~558)

The backend bug is in `server/src/llm/chat.rs`:
- The `chat()` function receives the full `messages: Vec<Message>` from the client
- Line ~283: `let mut current_history = messages.clone()` — correctly clones the full history
- Lines ~299-318: adds 2 system prompts at positions 0 and 1
- Lines ~323-404: the main LLM loop generates new assistant/tool messages
- **Line ~407: `ChatResult { messages: new_messages }` — BUG: returns ONLY the newly generated turn, not the full `current_history`**

During streaming, the `on_update()` callbacks DO send `current_history[2..]` (the full history minus the system prompts), which is correct. But there may be a reconciliation issue on the frontend where the final state doesn't include the full history.
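
The suspected bug and the shape of the surgical fix can be sketched as a minimal, self-contained Rust program. Here `Message`, `ChatResult`, the prompt contents, and the single-turn "LLM loop" are simplified stand-ins for the real definitions in `server/src/llm/chat.rs`, not the actual code:

```rust
// Simplified stand-ins for the real types in server/src/llm/chat.rs.
#[derive(Clone, Debug, PartialEq)]
struct Message {
    role: String,
    content: String,
}

#[derive(Debug)]
struct ChatResult {
    messages: Vec<Message>,
}

fn msg(role: &str, content: &str) -> Message {
    Message { role: role.to_string(), content: content.to_string() }
}

// Hypothetical reduction of chat(): clone incoming history, inject system
// prompts, generate one new turn, then build the result for the client.
fn chat(messages: Vec<Message>) -> ChatResult {
    // ~line 283: correctly clones the full incoming history
    let mut current_history = messages.clone();
    // ~lines 299-318: two system prompts at positions 0 and 1
    current_history.insert(0, msg("system", "base prompt"));
    current_history.insert(1, msg("system", "project prompt"));

    // ~lines 323-404: the LLM loop appends newly generated messages
    let new_messages = vec![msg("assistant", "new turn")];
    current_history.extend(new_messages.clone());

    // BUG (~line 407) would be: ChatResult { messages: new_messages }
    // Fix: return the full history minus the two injected system prompts,
    // mirroring what the streaming on_update() path already sends.
    ChatResult { messages: current_history[2..].to_vec() }
}

fn main() {
    let prior = vec![msg("user", "hello"), msg("assistant", "hi there")];
    let result = chat(prior.clone());
    // The prior turns survive the round trip, followed by the new turn.
    assert!(result.messages.starts_with(&prior));
    assert_eq!(result.messages.len(), 3);
    println!("round-trip ok: {} messages", result.messages.len());
}
```

Under this model, returning `new_messages` alone drops every prior turn from the result the client persists, while slicing `current_history[2..]` keeps the final result consistent with the streaming updates.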

### Key files

- `frontend/src/hooks/useChatHistory.ts` — localStorage persistence
- `frontend/src/components/Chat.tsx` — sends full history, handles `onUpdate` callbacks
- `frontend/src/api/client.ts` — WebSocket client
- `server/src/http/ws.rs` — WebSocket handler, passes messages to `chat()`
- `server/src/llm/chat.rs` — **THE BUG** at line ~407: `ChatResult` returns only `new_messages`

### What NOT to do

- Do NOT layer on a new localStorage implementation. The localStorage code works fine.
- Do NOT add server-side persistence. The "dumb pipe" architecture is correct.
- The fix should be surgical — ensure the full conversation history round-trips correctly through the backend.