feat: agent brain (ollama) and chat ui
```diff
@@ -87,7 +87,8 @@ When the LLM context window fills up (or the chat gets slow/confused):
 If a user hands you this document and says "Apply this process to my project":
 
 1. **Analyze the Request:** Ask for the high-level goal ("What are we building?") and the tech preferences ("Rust or Python?").
-2. **Scaffold:** Run commands to create the `specs/` and `stories/` folders.
-3. **Draft Context:** Write `specs/00_CONTEXT.md` based on the user's answer.
-4. **Draft Stack:** Write `specs/tech/STACK.md` based on best practices for that language.
-5. **Wait:** Ask the user for "Story #1".
+2. **Git Check:** Check if the directory is a git repository (`git status`). If not, run `git init`.
+3. **Scaffold:** Run commands to create the `specs/` and `stories/` folders.
+4. **Draft Context:** Write `specs/00_CONTEXT.md` based on the user's answer.
+5. **Draft Stack:** Write `specs/tech/STACK.md` based on best practices for that language.
+6. **Wait:** Ask the user for "Story #1".
```
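The scaffolding steps above can be sketched as a short shell sequence. This is a minimal sketch, assuming the folder and file names given in the document; the file contents themselves would come from the user's answers, not from this script.

```shell
# Sketch of steps 2-6 from the onboarding list above.
set -eu

# 2. Git Check: initialize only if this is not already a git repository.
#    (Guarded so the sketch also works where git is unavailable.)
if command -v git >/dev/null 2>&1; then
    git status >/dev/null 2>&1 || git init
fi

# 3. Scaffold: create the specs/ and stories/ folders.
mkdir -p specs/tech stories

# 4./5. Draft Context and Stack: create the files; their real contents
#       are drafted from the user's answers and language best practices.
touch specs/00_CONTEXT.md specs/tech/STACK.md

# 6. Wait: prompt the user for the first story.
echo 'Scaffold ready. Please provide "Story #1".'
```

Running it twice is safe: `git init`, `mkdir -p`, and `touch` are all idempotent here.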