moved from tauri to a server with embedded UI

This commit is contained in:
Dave
2026-02-13 12:31:36 +00:00
parent d4203cfaab
commit 0876c53e17
79 changed files with 5755 additions and 10655 deletions

View File

@@ -172,6 +172,10 @@ If a user hands you this document and says "Apply this process to my project":
**MANDATORY:** Before completing Step 4 (Verification) of any story, you MUST run all applicable linters and fix ALL errors and warnings. Zero tolerance for warnings or errors.
**AUTO-RUN CHECKS:** Always run the required lint/test/build checks as soon as relevant changes are made. Do not ask for permission to run them—run them automatically and fix any failures.
**ALWAYS FIX DIAGNOSTICS:** At every stage, you must proactively fix all errors and warnings without waiting for user confirmation. Do not pause to ask whether to fix diagnostics—fix them immediately as part of the workflow.
### TypeScript/JavaScript: Biome
* **Tool:** [Biome](https://biomejs.dev/) - Fast formatter and linter

View File

@@ -1,7 +1,7 @@
# Project Context
## High-Level Goal
-To build a standalone **Agentic AI Code Assistant** application using Tauri. The assistant will facilitate a "Story-Driven Spec Workflow" (SDSW) for software development. Unlike a passive chat interface, this assistant acts as an **Agent**, capable of using tools to read the filesystem, execute shell commands, manage git repositories, and modify code directly to implement features.
+To build a standalone **Agentic AI Code Assistant** application as a single Rust binary that serves a Vite/React web UI and exposes a WebSocket API. The assistant will facilitate a "Story-Driven Spec Workflow" (SDSW) for software development. Unlike a passive chat interface, this assistant acts as an **Agent**, capable of using tools to read the filesystem, execute shell commands, manage git repositories, and modify code directly to implement features.
## Core Features
1. **Chat Interface:** A conversational UI for the user to interact with the AI assistant.
@@ -28,6 +28,6 @@ To build a standalone **Agentic AI Code Assistant** application using Tauri. The
## Glossary
* **SDSW:** Story-Driven Spec Workflow.
-* **Tauri:** The framework used to build this assistant (Rust backend + Web frontend).
+* **Web Server Binary:** The Rust binary that serves the Vite/React frontend and exposes the WebSocket API.
* **Living Spec:** The collection of Markdown files in `.living_spec/` that define the project.
* **Tool Call:** A structured request from the LLM to execute a specific native function.

View File

@@ -1,12 +1,12 @@
# Tech Stack & Constraints
## Overview
-This project is a desktop application built with **Tauri**. It functions as an **Agentic Code Assistant** capable of safely executing tools on the host system.
+This project is a standalone Rust **web server binary** that serves a Vite/React frontend and exposes a **WebSocket API**. The built frontend assets are packaged with the binary (in a `frontend` directory) and served as static files. It functions as an **Agentic Code Assistant** capable of safely executing tools on the host system.
## Core Stack
-* **Backend:** Rust (Tauri Core)
+* **Backend:** Rust (Web Server)
* **MSRV:** Stable (latest)
-* **Framework:** Tauri v2
+* **Framework:** Poem HTTP server with WebSocket support for streaming; HTTP APIs should use Poem OpenAPI (Swagger) for non-streaming endpoints.
* **Frontend:** TypeScript + React
* **Build Tool:** Vite
* **Styling:** CSS Modules or Tailwind (TBD - Defaulting to CSS Modules)
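A minimal sketch of how such a binary might wire this together with Poem. The route paths, bind address, and `frontend` directory name are illustrative assumptions, not the project's actual wiring; the echo handler stands in for the real chat/tool loop:

```rust
// Sketch only: assumes the `poem` crate with the "websocket" and
// "static-files" features, plus `tokio` and `futures-util`.
use futures_util::{SinkExt, StreamExt};
use poem::{
    endpoint::StaticFilesEndpoint,
    get, handler,
    listener::TcpListener,
    web::websocket::{Message, WebSocket},
    IntoResponse, Route, Server,
};

#[handler]
fn ws(ws: WebSocket) -> impl IntoResponse {
    ws.on_upgrade(|mut socket| async move {
        // Placeholder loop: echo text frames back to the browser.
        // The real handler would drive the chat/tool-call state machine.
        while let Some(Ok(Message::Text(text))) = socket.next().await {
            let _ = socket.send(Message::Text(text)).await;
        }
    })
}

#[tokio::main]
async fn main() -> Result<(), std::io::Error> {
    let app = Route::new()
        // Streaming chat and tool execution over one WebSocket endpoint.
        .at("/ws", get(ws))
        // Built Vite/React assets served as static files.
        .nest(
            "/",
            StaticFilesEndpoint::new("frontend").index_file("index.html"),
        );
    Server::new(TcpListener::bind("127.0.0.1:3000"))
        .run(app)
        .await
}
```

Serving the SPA and the WebSocket from one `Route` is what makes the single-binary deployment work: one process, one port, no desktop shell.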
@@ -17,12 +17,12 @@ This project is a desktop application built with **Tauri**. It functions as an *
The application follows a **Tool-Use (Function Calling)** architecture:
1. **Frontend:** Collects user input and sends it to the LLM.
2. **LLM:** Decides to generate text OR request a **Tool Call** (e.g., `execute_shell`, `read_file`).
-3. **Tauri Backend (The "Hand"):**
+3. **Web Server Backend (The "Hand"):**
* Intercepts Tool Calls.
* Validates the request against the **Safety Policy**.
* Executes the native code (File I/O, Shell Process, Search).
* Returns the output (stdout/stderr/file content) to the LLM.
-* **Event Loop:** The backend emits real-time events (`chat:update`) to the frontend to ensure UI responsiveness during long-running Agent tasks.
+* **Streaming:** The backend sends real-time updates over WebSocket to keep the UI responsive during long-running Agent tasks.
## LLM Provider Abstraction
To support both Remote and Local models, the system implements a `ModelProvider` abstraction layer.
@@ -39,8 +39,7 @@ To support both Remote and Local models, the system implements a `ModelProvider`
* Otherwise → Ollama
* Single unified model dropdown with section headers ("Anthropic", "Ollama")
* **API Key Management:**
-* Anthropic API key stored in OS keychain (macOS Keychain, Windows Credential Manager, Linux Secret Service)
-* Uses `keyring` crate for cross-platform secure storage
+* Anthropic API key stored server-side and persisted securely
* On first use of Claude model, user prompted to enter API key
* Key persists across sessions (no re-entry needed)
@@ -98,15 +97,11 @@ To support both Remote and Local models, the system implements a `ModelProvider`
* `tokio`: Async runtime.
* `reqwest`: For LLM API calls (Anthropic, Ollama).
* `eventsource-stream`: For Server-Sent Events (Anthropic streaming).
-* `keyring`: Secure API key storage in OS keychain.
* `uuid`: For unique message IDs.
* `chrono`: For timestamps.
-* `tauri-plugin-dialog`: Native system dialogs.
-* `tauri-plugin-store`: Persistent key-value storage.
+* `poem`: HTTP server framework.
+* `poem-openapi`: OpenAPI (Swagger) for non-streaming HTTP APIs.
* **JavaScript:**
-* `@tauri-apps/api`: Tauri Bridge.
-* `@tauri-apps/plugin-dialog`: Dialog API.
-* `@tauri-apps/plugin-store`: Store API.
* `react-markdown`: For rendering chat responses.
## Safety & Sandbox

View File

@@ -0,0 +1,23 @@
# Story 01: Replace Tauri with Browser UI Served by Rust Binary
## User Story
As a user, I want to run a single Rust binary that serves the web UI and exposes a WebSocket API, so I can use the app in my browser without installing a desktop shell.
## Acceptance Criteria
- The app runs as a single Rust binary that:
- Serves the built frontend assets from a `frontend` directory.
- Exposes a WebSocket endpoint for chat streaming and tool execution.
- The browser UI uses the WebSocket API for:
- Sending chat messages.
- Receiving streaming token updates and final chat history updates.
- Requesting file operations, search, and shell execution.
- The project selection UI uses a browser file picker (not native OS dialogs).
- Model preference and last project selection are persisted server-side (no Tauri store).
- The Tauri backend and configuration are removed from the build pipeline.
- The frontend remains a Vite/React build and is served as static assets by the Rust binary.
## Out of Scope
- Reworking the LLM provider implementations beyond wiring changes.
- Changing the UI layout/visual design.
- Adding authentication or multi-user support.
- Switching away from Vite for frontend builds.