story-kit: queue 152_bug_ollama_not_running_kills_the_entire_web_ui for QA
@@ -1,29 +0,0 @@
---
name: "Ollama not running kills the entire web UI"
---

# Bug 152: Ollama not running kills the entire web UI

## Description

On load, the UI fetches Ollama models via `/api/ollama/models` (`server/src/http/model.rs`, line 40). When Ollama is not running, the request fails and the error propagates unhandled, taking down the entire web UI.

The server endpoint in `server/src/http/model.rs` should return an empty list instead of an error when Ollama is unreachable. Alternatively, the frontend should catch the error gracefully and simply show no Ollama models in the dropdown.

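Either fix amounts to mapping a failed Ollama request to an empty model list rather than an error. Below is a minimal Rust sketch of the server-side option; all names here (`fetch_ollama_models`, `OllamaError`, `list_models_or_empty`) are hypothetical stand-ins, not the actual code in `server/src/http/model.rs`.

```rust
// Hypothetical sketch: fall back to an empty model list when Ollama is unreachable.
// `fetch_ollama_models` stands in for the real request to http://localhost:11434/api/tags.

#[derive(Debug)]
struct OllamaError(String);

fn fetch_ollama_models() -> Result<Vec<String>, OllamaError> {
    // Simulate Ollama not running: the connection is refused.
    Err(OllamaError(
        "error sending request for url (http://localhost:11434/api/tags)".into(),
    ))
}

// The handler-level fix: map any fetch error to an empty list instead of a 500.
fn list_models_or_empty() -> Vec<String> {
    fetch_ollama_models().unwrap_or_else(|e| {
        eprintln!("Ollama unreachable, returning empty model list: {:?}", e);
        Vec::new()
    })
}

fn main() {
    let models = list_models_or_empty();
    assert!(models.is_empty());
    println!("models = {:?}", models); // prints "models = []"
}
```

In the real handler, the same `unwrap_or_else` (or `unwrap_or_default`) fallback would wrap the HTTP call, so an unreachable Ollama yields a `200 OK` with `[]` instead of an error response that breaks the UI.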
## How to Reproduce

1. Stop Ollama (or never start it)
2. Open the web UI
3. Observe the error: `Request failed: error sending request for url (http://localhost:11434/api/tags)`

## Actual Result

The entire web UI is broken. Nothing works, not even features that do not depend on Ollama.

## Expected Result

The Ollama model fetch should fail silently or show an empty model list. The rest of the UI should continue to work normally with the Claude Code or Anthropic providers.

## Acceptance Criteria

- [ ] Bug is fixed and verified