From 4e2d7416e8bf67f05fa7530de6334352e5d8bae0 Mon Sep 17 00:00:00 2001
From: Dave
Date: Wed, 24 Dec 2025 17:34:22 +0000
Subject: [PATCH] Added some more stories for later

---
 .../stories/05_persist_project_selection.md | 16 ++++++++++++++++
 .../stories/06_fix_ui_responsiveness.md     | 19 +++++++++++++++++++
 2 files changed, 35 insertions(+)
 create mode 100644 .living_spec/stories/05_persist_project_selection.md
 create mode 100644 .living_spec/stories/06_fix_ui_responsiveness.md

diff --git a/.living_spec/stories/05_persist_project_selection.md b/.living_spec/stories/05_persist_project_selection.md
new file mode 100644
index 0000000..4edf02e
--- /dev/null
+++ b/.living_spec/stories/05_persist_project_selection.md
@@ -0,0 +1,16 @@
+# Story: Persist Project Selection
+
+## User Story
+**As a** User
+**I want** the application to remember the last project I opened
+**So that** I don't have to re-select the directory every time I restart the app.
+
+## Acceptance Criteria
+* [ ] Backend: Use `tauri-plugin-store` (or simple JSON file) to persist `last_project_path`.
+* [ ] Backend: On app startup, check if a saved path exists.
+* [ ] Backend: If saved path exists and is valid, automatically load it into `SessionState`.
+* [ ] Frontend: On load, check if backend has a project ready. If so, skip selection screen.
+* [ ] Frontend: Add a "Close Project" button to clear the state and return to selection screen.
+
+## Out of Scope
+* Managing a list of "Recent Projects" (just the last one is fine for now).
diff --git a/.living_spec/stories/06_fix_ui_responsiveness.md b/.living_spec/stories/06_fix_ui_responsiveness.md
new file mode 100644
index 0000000..b6988ba
--- /dev/null
+++ b/.living_spec/stories/06_fix_ui_responsiveness.md
@@ -0,0 +1,19 @@
+# Story: Fix UI Responsiveness (Tech Debt)
+
+## User Story
+**As a** User
+**I want** the UI to remain interactive and responsive while the Agent is thinking or executing tools
+**So that** I don't feel like the application has crashed.
+
+## Context
+Currently, the UI locks up or becomes unresponsive during long LLM generations or tool executions. Even though the backend commands are async, the frontend experience degrades.
+
+## Acceptance Criteria
+* [ ] Investigate the root cause of the freezing (JS Main Thread blocking vs. Tauri IPC blocking).
+* [ ] Implement a "Streaming" architecture for Chat if necessary (getting partial tokens instead of waiting for full response).
+  * *Note: This might overlap with future streaming stories, but basic responsiveness is the priority here.*
+* [ ] Add visual indicators (Spinner/Progress Bar) that animate smoothly during the wait.
+* [ ] Ensure the "Stop Generation" button (if added) can actually interrupt the backend task.
+
+## Out of Scope
+* Full streaming text (unless that is the only way to fix the freezing).
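The "simple JSON file" fallback mentioned in Story 05 can be sketched in plain std Rust, without committing to the `tauri-plugin-store` API. This is a minimal illustration only: the file name, helper names, and storage location are assumptions (a real Tauri app would resolve the app-config directory), and the "exists and is valid" check is reduced to an `is_dir` test.

```rust
use std::fs;
use std::path::{Path, PathBuf};

// Hypothetical settings file; a real app would place this in the
// platform-specific app-config directory rather than the working dir.
const SETTINGS_FILE: &str = "last_project_path.txt";

fn save_last_project(path: &Path) -> std::io::Result<()> {
    fs::write(SETTINGS_FILE, path.to_string_lossy().as_bytes())
}

fn load_last_project() -> Option<PathBuf> {
    let saved = fs::read_to_string(SETTINGS_FILE).ok()?;
    let candidate = PathBuf::from(saved.trim());
    // Only restore the path if it still points at a valid directory,
    // per the "saved path exists and is valid" acceptance criterion.
    candidate.is_dir().then_some(candidate)
}

fn main() -> std::io::Result<()> {
    let project = std::env::temp_dir(); // stand-in for a real project dir
    save_last_project(&project)?;
    match load_last_project() {
        Some(p) => println!("restoring project: {}", p.display()),
        None => println!("no valid saved project; show selection screen"),
    }
    Ok(())
}
```

The "Close Project" criterion would then just delete the settings file and clear `SessionState`.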
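The streaming and interruption criteria in Story 06 reduce to two mechanisms: pushing partial tokens over a channel instead of returning one big response, and checking a shared stop flag between tokens. A thread-based sketch follows; the hard-coded token list and `spawn_generation` helper are stand-ins (the real backend would forward tokens to the frontend as Tauri events), not the project's actual design.

```rust
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::{mpsc, Arc};
use std::thread;
use std::time::Duration;

// Hypothetical token source standing in for the LLM backend.
fn spawn_generation(stop: Arc<AtomicBool>) -> mpsc::Receiver<String> {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        for token in ["The", " answer", " is", " 42"] {
            if stop.load(Ordering::Relaxed) {
                return; // "Stop Generation" pressed: abandon the rest
            }
            thread::sleep(Duration::from_millis(5)); // stand-in for latency
            if tx.send(token.to_string()).is_err() {
                return; // receiver (the UI side) went away
            }
        }
    });
    rx
}

fn main() {
    let stop = Arc::new(AtomicBool::new(false));
    // UI side: render each partial token as it arrives instead of
    // blocking until the full response is ready.
    for token in spawn_generation(stop) {
        print!("{token}");
    }
    println!(); // prints "The answer is 42"
}
```

Because the generation runs off the consuming thread and checks `stop` between tokens, the UI loop stays responsive and a stop button only has to flip the flag.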