feat: agent brain (ollama) and chat ui

Dave
2025-12-24 17:17:35 +00:00
parent 76e03bc1a2
commit d9cd16601b
18 changed files with 1712 additions and 14 deletions


@@ -87,7 +87,8 @@ When the LLM context window fills up (or the chat gets slow/confused):
 If a user hands you this document and says "Apply this process to my project":
 1. **Analyze the Request:** Ask for the high-level goal ("What are we building?") and the tech preferences ("Rust or Python?").
-2. **Scaffold:** Run commands to create the `specs/` and `stories/` folders.
-3. **Draft Context:** Write `specs/00_CONTEXT.md` based on the user's answer.
-4. **Draft Stack:** Write `specs/tech/STACK.md` based on best practices for that language.
-5. **Wait:** Ask the user for "Story #1".
+2. **Git Check:** Check if the directory is a git repository (`git status`). If not, run `git init`.
+3. **Scaffold:** Run commands to create the `specs/` and `stories/` folders.
+4. **Draft Context:** Write `specs/00_CONTEXT.md` based on the user's answer.
+5. **Draft Stack:** Write `specs/tech/STACK.md` based on best practices for that language.
+6. **Wait:** Ask the user for "Story #1".
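
The updated bootstrap sequence in the hunk above could be sketched as a shell session. This is a minimal sketch, not part of the commit: the `demo-project` directory name is hypothetical, and the `printf` placeholders stand in for the drafts the agent would actually write.

```shell
set -e

# Hypothetical project directory for illustration only
mkdir -p demo-project && cd demo-project

# 2. Git Check: is this directory already a git repository? If not, init one.
if ! git status >/dev/null 2>&1; then
    git init --quiet
fi

# 3. Scaffold: create the specs/ and stories/ folders
mkdir -p specs/tech stories

# 4./5. Draft Context and Stack (placeholders; the real drafts come from the agent)
printf '# Project Context\n' > specs/00_CONTEXT.md
printf '# Tech Stack\n'      > specs/tech/STACK.md

# 6. Wait: at this point the agent would stop and ask the user for "Story #1"
```

Doing the git check before scaffolding (the ordering this commit introduces) means the `specs/` and `stories/` folders are created inside a tracked repository from the start.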