Story 22: Implement smart auto-scroll that respects user scrolling

User Story:
As a user, I want to be able to scroll up to review previous messages
while the AI is streaming or adding new content, without being
constantly dragged back to the bottom.

Implementation:
- Replaced position-based threshold detection with user-intent tracking
- Detects when user scrolls UP and disables auto-scroll completely
- Auto-scroll re-enables only when the user manually returns to the bottom (within 5px)
- Uses refs to track scroll position and direction for smooth operation
- Works seamlessly during rapid token streaming and tool execution

Technical Details:
- lastScrollTopRef: Tracks previous scroll position to detect direction
- userScrolledUpRef: Flag set when upward scrolling is detected
- Direct scrollTop manipulation for instant scrolling that doesn't fight user input
- Threshold of 5px from absolute bottom to re-enable auto-scroll
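The detection logic described above can be sketched as a pure function. This is an illustrative sketch, not the actual Chat.tsx code: in the real component these values live in refs (lastScrollTopRef, userScrolledUpRef), and the names and types here are assumptions.

```typescript
// Illustrative model of the user-intent tracking described above.
type ScrollState = { lastScrollTop: number; userScrolledUp: boolean };
type ScrollMetrics = { scrollTop: number; scrollHeight: number; clientHeight: number };

const BOTTOM_THRESHOLD_PX = 5; // re-enable auto-scroll within 5px of the bottom

function onScroll(state: ScrollState, el: ScrollMetrics): ScrollState {
  const scrolledUp = el.scrollTop < state.lastScrollTop;
  const distanceFromBottom = el.scrollHeight - el.scrollTop - el.clientHeight;
  const atBottom = distanceFromBottom < BOTTOM_THRESHOLD_PX;
  return {
    lastScrollTop: el.scrollTop,
    // Upward movement disables auto-scroll; only returning to the
    // bottom re-enables it. Otherwise the flag is left unchanged.
    userScrolledUp: scrolledUp ? true : atBottom ? false : state.userScrolledUp,
  };
}

function shouldAutoScroll(state: ScrollState): boolean {
  return !state.userScrolledUp;
}
```

During streaming, the component would consult `shouldAutoScroll` before setting `scrollTop = scrollHeight` directly, so new tokens never drag the user away from the history they are reading.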

Spec Updates:
- Added comprehensive Smart Auto-Scroll section to UI_UX.md
- Documented the problem, solution, requirements, and implementation
- Includes code examples and edge case handling

Acceptance Criteria Met:
✓ Auto-scroll disabled when scrolling up
✓ Auto-scroll resumes when returning to bottom
✓ Works normally when already at bottom
✓ Smooth detection without flickering
✓ Works during streaming and tool execution

Files Changed:
- src/components/Chat.tsx: Implemented user-intent tracking
- .living_spec/specs/functional/UI_UX.md: Added Smart Auto-Scroll spec
- .living_spec/stories/22_smart_autoscroll.md: Marked complete
Author: Dave
Date: 2025-12-27 19:21:34 +00:00
Commit: 57826dc5ee (parent: 1baf3fa728)
4 changed files with 150 additions and 8 deletions


@@ -0,0 +1,36 @@
# Story 23: Alphabetize LLM Dropdown List
## User Story
As a user, I want the LLM model dropdown to be alphabetically sorted so I can quickly find the model I'm looking for.
## Acceptance Criteria
- [ ] The model dropdown list is sorted alphabetically (case-insensitive)
- [ ] The currently selected model remains selected after sorting
- [ ] The sorting works for all models returned from Ollama
- [ ] The sorted list updates correctly when models are added/removed
## Out of Scope
- Grouping models by type or provider
- Custom sort orders (e.g., by popularity, recency)
- Search/filter functionality in the dropdown
- Favoriting or pinning specific models to the top
## Technical Notes
- Models are fetched from `get_ollama_models` Tauri command
- Currently displayed in the order returned by the backend
- Sort should be case-insensitive (e.g., "Llama" and "llama" treated equally)
- JavaScript's `sort()` with `localeCompare()` is ideal for this
## Implementation Approach
```tsx
// After fetching models from the backend.
// Copy before sorting: Array.prototype.sort mutates in place,
// which would silently reorder the original array.
const sortedModels = [...models].sort((a, b) =>
  a.toLowerCase().localeCompare(b.toLowerCase())
);
setAvailableModels(sortedModels);
```
## Design Considerations
- Keep it simple - alphabetical order is intuitive
- Case-insensitive to handle inconsistent model naming
- No need to change backend - sorting on frontend is sufficient
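For illustration, the case-insensitive comparison handles mixed-case names as intended (the model names below are hypothetical, not actual Ollama output):

```typescript
// Hypothetical model list demonstrating case-insensitive alphabetical order.
const models = ["mistral", "Llama3", "codellama", "Phi3"];
const sorted = [...models].sort((a, b) =>
  a.toLowerCase().localeCompare(b.toLowerCase())
);
// sorted: ["codellama", "Llama3", "mistral", "Phi3"]
```

A plain `sort()` without the lowercase comparison would place all capitalized names first, splitting "Llama" and "llama" apart in the dropdown.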