Functional Spec: UI/UX Responsiveness
Problem
Currently, the `chat` command in Rust is an async function that runs a long, blocking loop (waiting on the LLM, executing tools). Tauri runs this on a separate thread from the UI, but the frontend awaits the entire result before re-rendering. This makes the app feel "frozen": there is no feedback during the 10-60 seconds of generation.
Solution: Event-Driven Feedback
Instead of waiting for the final array of messages, the Backend should emit Events to the Frontend in real-time.
1. Events
- `chat:token`: emitted when a text token is generated (streaming text).
- `chat:tool-start`: emitted when a tool call begins (e.g., `{ tool: "git status" }`).
- `chat:tool-end`: emitted when a tool call finishes (e.g., `{ output: "..." }`).
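The event names and payload shapes above can be sketched as plain Rust types. This is a minimal illustration, not the actual codebase: in a real Tauri app the payload structs would derive `serde::Serialize` so they can cross the IPC boundary, and all struct names here are assumptions.

```rust
// Event name constants keep backend and frontend in sync (names from the spec).
const EVT_TOKEN: &str = "chat:token";
const EVT_TOOL_START: &str = "chat:tool-start";
const EVT_TOOL_END: &str = "chat:tool-end";

// Illustrative payload shapes; a real app would add #[derive(serde::Serialize)].
#[derive(Debug)]
struct TokenPayload { token: String }

#[derive(Debug)]
struct ToolStartPayload { tool: String }

#[derive(Debug)]
struct ToolEndPayload { output: String }

fn main() {
    let start = ToolStartPayload { tool: "git status".into() };
    println!("{EVT_TOOL_START}: {:?}", start);
    let _ = (EVT_TOKEN, EVT_TOOL_END); // the other two events, used the same way
}
```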
2. Implementation Strategy (MVP)
For this story, we won't fully implement token streaming (mixing reqwest's blocking/async modes with stream parsing is complex). We will focus on state updates:
- Refactor the `chat` command: instead of returning `Vec<Message>` at the very end, it accepts an `AppHandle`.
- Inside the loop, after every step (LLM response, tool execution), emit a `chat:update` event containing the current partial history.
- The frontend listens for `chat:update` and re-renders immediately.
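The loop refactor above can be sketched as follows. To keep the sketch runnable outside Tauri, the `AppHandle` is replaced by a hypothetical `EventSink` trait; in the real command you would call something like `app_handle.emit("chat:update", &history)` instead. All names here (`EventSink`, `RecordingSink`, the string-based history) are illustrative assumptions, not the actual codebase.

```rust
use std::cell::RefCell;

// Stand-in for the Tauri AppHandle's emit capability, so the loop is testable.
trait EventSink {
    fn emit(&self, event: &str, payload: &str);
}

// Test double that records every emitted event instead of sending it to a webview.
struct RecordingSink { events: RefCell<Vec<(String, String)>> }

impl EventSink for RecordingSink {
    fn emit(&self, event: &str, payload: &str) {
        self.events.borrow_mut().push((event.to_string(), payload.to_string()));
    }
}

// The refactored shape of the chat loop: append each step to the history, then
// immediately emit the partial history instead of returning Vec<Message> at the end.
fn chat(sink: &impl EventSink, steps: &[&str]) {
    let mut history: Vec<String> = Vec::new();
    for step in steps {
        history.push(step.to_string());
        sink.emit("chat:update", &history.join(" | "));
    }
}

fn main() {
    let sink = RecordingSink { events: RefCell::new(Vec::new()) };
    chat(&sink, &["llm: thinking", "tool: git status", "llm: done"]);
    // One chat:update per step, each carrying the partial history so far.
    assert_eq!(sink.events.borrow().len(), 3);
    println!("{}", sink.events.borrow().last().unwrap().1);
    // → llm: thinking | tool: git status | llm: done
}
```

The frontend side is symmetric: it subscribes to `chat:update` and replaces its message state on every event, which is what makes the UI re-render during generation rather than after it.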
3. Visuals
- Loading State: The "Send" button should show a spinner or "Stop" button.
- Auto-Scroll: The chat view should stick to the bottom as new events arrive.