storkit/.living_spec/stories/archive/03_llm_ollama.md
2025-12-24 17:17:35 +00:00

# Story: The Agent Brain (Ollama Integration)

## User Story

As a user, I want to connect the Assistant to a local Ollama instance, so that I can chat with the Agent and have it execute tools without sending my data to the cloud.

## Acceptance Criteria

- Backend: Implement a `ModelProvider` trait/interface.
- Backend: Implement `OllamaProvider` (POST `/api/chat`).
- Backend: Implement a `chat(message, history, provider_config)` command.
  - Must support passing tool definitions to Ollama (if the model supports them), or fall back to system-prompt instructions.
  - Must parse tool calls from the response.
- Frontend: Settings screen to toggle "Ollama" and set the model name (default: `llama3`).
- Frontend: Chat interface.
  - Message history (User/Assistant).
  - Tool-call visualization (e.g., "Running git status...").

## Out of Scope

- Remote providers (Anthropic/OpenAI): future story.
- Streaming responses (wait for the full completion for the MVP).
- Complex context-window management (just send the full history for now).