feat: agent brain (ollama) and chat ui
.living_spec/stories/archive/02_core_agent_tools.md (new file, 20 additions)
@@ -0,0 +1,20 @@
# Story: Core Agent Tools (The Hands)

## User Story

**As an** Agent
**I want to** be able to read files, list directories, search content, and execute shell commands
**So that** I can autonomously explore and modify the target project.

## Acceptance Criteria

* [ ] Rust Backend: Implement `read_file(path)` command (scoped to project).
* [ ] Rust Backend: Implement `write_file(path, content)` command (scoped to project).
* [ ] Rust Backend: Implement `list_directory(path)` command.
* [ ] Rust Backend: Implement `exec_shell(command, args)` command.
  * [ ] Must enforce allowlist (git, cargo, npm, etc).
  * [ ] Must run in project root.
* [ ] Rust Backend: Implement `search_files(query, globs)` using `ignore` crate.
* [ ] Frontend: Expose these as tools to the (future) LLM interface.
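The `exec_shell` guards and the path scoping shared by `read_file`/`write_file` can be sketched as below. The names (`ALLOWED_COMMANDS`, `scoped_path`) are illustrative, not part of the spec; the scope check resolves paths lexically rather than via `canonicalize`, so a `write_file` target that does not exist yet can still be validated:

```rust
use std::path::{Component, Path, PathBuf};

// Hypothetical starter set; the story only names git, cargo and npm.
const ALLOWED_COMMANDS: &[&str] = &["git", "cargo", "npm"];

/// Reject any command not on the allowlist before it reaches
/// `std::process::Command`.
fn is_allowed_command(command: &str) -> bool {
    ALLOWED_COMMANDS.contains(&command)
}

/// Lexically resolve `requested` against the project `root` and refuse
/// anything that would escape it (absolute paths, `..` past the root).
fn scoped_path(root: &Path, requested: &str) -> Option<PathBuf> {
    let mut resolved = root.to_path_buf();
    for part in Path::new(requested).components() {
        match part {
            Component::Normal(seg) => resolved.push(seg),
            Component::CurDir => {}
            // `..` may not climb above the project root.
            Component::ParentDir => {
                if resolved.as_path() == root {
                    return None;
                }
                resolved.pop();
            }
            // Absolute prefixes would escape the scope entirely.
            Component::RootDir | Component::Prefix(_) => return None,
        }
    }
    Some(resolved)
}

fn main() {
    let root = Path::new("/work/project");
    assert!(is_allowed_command("git"));
    assert!(!is_allowed_command("rm"));
    assert_eq!(
        scoped_path(root, "src/main.rs"),
        Some(PathBuf::from("/work/project/src/main.rs"))
    );
    assert_eq!(scoped_path(root, "../secrets"), None);
    println!("scope checks passed");
}
```

`exec_shell` would then call `is_allowed_command` before spawning `std::process::Command`, with `current_dir` set to the project root to satisfy the second sub-criterion.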

## Out of Scope

* The LLM Chat UI itself (connecting these to a visual chat window comes later).
* Complex git merges (simple commands only).
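The `search_files` criterion can be prototyped as a plain recursive walk. This sketch takes a hypothetical simplified signature (query as a literal substring, no `globs`); the real implementation would swap the manual recursion for the `ignore` crate so `.gitignore` rules apply and the glob filter is honored:

```rust
use std::fs;
use std::io;
use std::path::{Path, PathBuf};

/// Recursively collect files under `root` whose contents contain `query`.
/// Binary files fail UTF-8 decoding in `read_to_string` and are skipped.
fn search_files(root: &Path, query: &str) -> io::Result<Vec<PathBuf>> {
    let mut hits = Vec::new();
    for entry in fs::read_dir(root)? {
        let path = entry?.path();
        if path.is_dir() {
            hits.extend(search_files(&path, query)?);
        } else if let Ok(text) = fs::read_to_string(&path) {
            if text.contains(query) {
                hits.push(path);
            }
        }
    }
    Ok(hits)
}

fn main() -> io::Result<()> {
    let dir = std::env::temp_dir().join("search_files_demo");
    fs::create_dir_all(&dir)?;
    fs::write(dir.join("a.rs"), "fn read_file() {}")?;
    fs::write(dir.join("b.rs"), "fn write_file() {}")?;
    let hits = search_files(&dir, "read_file")?;
    assert_eq!(hits.len(), 1);
    println!("found {} match(es)", hits.len());
    Ok(())
}
```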
.living_spec/stories/archive/03_llm_ollama.md (new file, 22 additions)
@@ -0,0 +1,22 @@
# Story: The Agent Brain (Ollama Integration)

## User Story

**As a** User
**I want to** connect the Assistant to a local Ollama instance
**So that** I can chat with the Agent and have it execute tools without sending data to the cloud.

## Acceptance Criteria

* [ ] Backend: Implement `ModelProvider` trait/interface.
* [ ] Backend: Implement `OllamaProvider` (POST /api/chat).
* [ ] Backend: Implement `chat(message, history, provider_config)` command.
* [ ] Must support passing Tool Definitions to Ollama (if model supports it) or System Prompt instructions.
* [ ] Must parse Tool Calls from the response.
* [ ] Frontend: Settings Screen to toggle "Ollama" and set Model Name (default: `llama3`).
* [ ] Frontend: Chat Interface.
  * [ ] Message History (User/Assistant).
  * [ ] Tool Call visualization (e.g., "Running git status...").
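One way to cut the `ModelProvider` seam is sketched below, with a canned provider so it runs without a live Ollama. All type and method names here are assumptions, not the spec's; `OllamaProvider` would implement the same trait by POSTing to /api/chat, and remote providers slot in later:

```rust
/// A tool invocation parsed out of a model response.
#[derive(Debug, PartialEq)]
struct ToolCall {
    name: String,
    arguments: String, // raw JSON; a real impl would use serde_json::Value
}

/// What a provider hands back: plain text, or tool calls to execute.
#[derive(Debug, PartialEq)]
enum ModelReply {
    Text(String),
    ToolCalls(Vec<ToolCall>),
}

/// The abstraction the `chat` command talks to.
trait ModelProvider {
    fn name(&self) -> &str;
    fn chat(&self, history: &[(String, String)], message: &str) -> Result<ModelReply, String>;
}

/// Canned provider used here so the sketch runs offline.
struct EchoProvider;

impl ModelProvider for EchoProvider {
    fn name(&self) -> &str {
        "echo"
    }
    fn chat(&self, _history: &[(String, String)], message: &str) -> Result<ModelReply, String> {
        Ok(ModelReply::Text(format!("echo: {}", message)))
    }
}

fn main() {
    // The chat command only sees the trait object, never the concrete provider.
    let provider: Box<dyn ModelProvider> = Box::new(EchoProvider);
    assert_eq!(provider.name(), "echo");
    match provider.chat(&[], "hello").unwrap() {
        ModelReply::Text(t) => assert_eq!(t, "echo: hello"),
        ModelReply::ToolCalls(_) => unreachable!(),
    }
    println!("provider ok");
}
```

Returning an enum lets the `chat` command branch cleanly between "render assistant text" and "run tools from Story 02, then loop", which is what the Tool Call visualization criterion needs.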

## Out of Scope

* Remote Providers (Anthropic/OpenAI) - Future Story.
* Streaming responses (wait for full completion for MVP).
* Complex context window management (just send full history for now).
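With streaming out of scope, a single non-streaming /api/chat request might look like the sketch below. Field shapes follow Ollama's chat API as documented at the time of writing and should be verified against the current docs; the `exec_shell` tool definition is illustrative:

```json
{
  "model": "llama3",
  "stream": false,
  "messages": [
    { "role": "user", "content": "What changed in the repo?" }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "exec_shell",
        "description": "Run an allowlisted shell command in the project root",
        "parameters": {
          "type": "object",
          "properties": {
            "command": { "type": "string" },
            "args": { "type": "array", "items": { "type": "string" } }
          },
          "required": ["command"]
        }
      }
    }
  ]
}
```

If the model decides to call a tool, the reply's `message.tool_calls[].function` carries a `name` and an `arguments` object to dispatch to the Story 02 commands; otherwise `message.content` holds the assistant text.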