From 97a249d9a23bf40e0ab53d0a9e8d119a4b19c28e Mon Sep 17 00:00:00 2001
From: Dave
Date: Tue, 24 Feb 2026 15:07:30 +0000
Subject: [PATCH] story-kit: create
 152_bug_ollama_not_running_kills_the_entire_web_ui

---
 ...ama_not_running_kills_the_entire_web_ui.md | 29 +++++++++++++++++++
 1 file changed, 29 insertions(+)
 create mode 100644 .story_kit/work/1_upcoming/152_bug_ollama_not_running_kills_the_entire_web_ui.md

diff --git a/.story_kit/work/1_upcoming/152_bug_ollama_not_running_kills_the_entire_web_ui.md b/.story_kit/work/1_upcoming/152_bug_ollama_not_running_kills_the_entire_web_ui.md
new file mode 100644
index 0000000..0d7964a
--- /dev/null
+++ b/.story_kit/work/1_upcoming/152_bug_ollama_not_running_kills_the_entire_web_ui.md
@@ -0,0 +1,29 @@
+---
+name: "Ollama not running kills the entire web UI"
+---
+
+# Bug 152: Ollama not running kills the entire web UI
+
+## Description
+
+The UI fetches the Ollama model list on load via `/api/ollama/models` (`server/src/http/model.rs`, line 40). When Ollama is not running, the request fails and the error propagates in a way that kills the whole UI.
+
+The server endpoint in `server/src/http/model.rs` should return an empty list instead of an error when Ollama is unreachable; alternatively, the frontend should catch the error and simply show no Ollama models in the dropdown.
+
+## How to Reproduce
+
+1. Stop Ollama (or never start it)
+2. Open the web UI
+3. Observe the error: `Request failed: error sending request for url (http://localhost:11434/api/tags)`
+
+## Actual Result
+
+The entire web UI is broken. Nothing works.
+
+## Expected Result
+
+The Ollama model fetch should fail silently or fall back to an empty model list. The rest of the UI should keep working normally with the Claude Code or Anthropic providers.
+
+## Acceptance Criteria
+
+- [ ] With Ollama stopped, the web UI loads and non-Ollama providers work; the model dropdown shows no Ollama entries
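
The server-side fix proposed in the Description can be sketched as follows. This is a minimal illustration of the degrade-gracefully pattern only: `fetch_ollama_models` is a hypothetical stand-in for the real HTTP call to `http://localhost:11434/api/tags` (here it always fails, simulating a stopped Ollama daemon), and the handler name `list_models` is illustrative, not taken from the repo.

```rust
// Stand-in for the real request to Ollama's /api/tags endpoint.
// It always fails here, simulating an unreachable Ollama daemon.
fn fetch_ollama_models() -> Result<Vec<String>, String> {
    Err("error sending request for url (http://localhost:11434/api/tags)".into())
}

// Instead of propagating the error (which currently kills the whole UI),
// the handler logs it and falls back to an empty model list, so the
// dropdown simply shows no Ollama models.
fn list_models() -> Vec<String> {
    fetch_ollama_models().unwrap_or_else(|err| {
        eprintln!("ollama unreachable, returning empty model list: {err}");
        Vec::new()
    })
}

fn main() {
    let models = list_models();
    assert!(models.is_empty());
    println!("models: {models:?}"); // prints: models: []
}
```

The same shape applies in the real handler: map the transport error to an `Ok` response carrying an empty list, rather than surfacing it as a failed request to the frontend.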