Compare commits

22 Commits

| Author | SHA1 | Date |
|---|---|---|
| | 0181dbbb16 | |
| | 07ef7045ce | |
| | 09151e37ef | |
| | e7deb65e45 | |
| | 45f1096b96 | |
| | b77e139347 | |
| | 43ca0cbc59 | |
| | 982e65aec5 | |
| | 6c76b569c4 | |
| | fd7698f0e7 | |
| | 4b710b02f2 | |
| | e734e80da5 | |
| | 4ddf2a4367 | |
| | 2b95388efd | |
| | 9f0274417d | |
| | df2f20a5e5 | |
| | 61502f51d9 | |
| | 4553d7215a | |
| | 4a1c6b4cfa | |
| | 2663c5f91f | |
| | 79ee19ca5b | |
| | 871a18f821 | |
@@ -5,6 +5,9 @@
 # Local environment (secrets)
 .env
 
+# Local-only scripts
+script/local-release
+
 # App specific (root-level; huskies subdirectory patterns live in .huskies/.gitignore)
 store.json
 .huskies_port
|||||||
@@ -1,126 +0,0 @@
# Huskies architectural session — 2026-04-09 handoff

## tl;dr for the next agent

We spent today operating huskies under realistic stress and discovered that the **491/492 CRDT migration is incomplete**. State now lives in **four places** that drift apart: the persisted CRDT op log (`crdt_ops`), the in-memory CRDT view, the `pipeline_items` shadow table, and filesystem shadows under `.huskies/work/`. Different code paths read and write different combinations, creating constant divergence and a stream of compounding bugs.

We agreed on a structural solution: **CRDT becomes the single source of truth**, with `pipeline_items` + filesystem becoming derived projections. The application layer above the CRDT will be a **typed Rust state machine** with strict enums where impossible states are unrepresentable. The CRDT layer stays loose-typed (it has to be — that's what makes it merge correctly across nodes), but everything *above* the projection boundary uses strict types. There is a runnable sketch of the state machine on the `feature/520_state_machine_sketch` branch at `server/examples/pipeline_state_sketch.rs`.
## What landed on master today

```
5765fb57 merge(478): WebSocket CRDT sync layer (manual squash from feature/story-478)
41515e3b huskies: merge 503_bug_depends_on_pointing_at_an_archived_story_…
8b2e068d fix(502): don't demote merge-stage stories on mergemaster attach ← my fix this session
59fbb562 chore: ignore pipeline.db backup files in .huskies/.gitignore
```

The 478 work was originally on `feature/story-478_…` (3 commits, ~778 insertions, including a 518-line `server/src/crdt_sync.rs`). We tried to merge it through the normal pipeline path, but bugs 502, 510, 501, and 511, plus a silent failure mode in mergemaster, made that intractable. After fixing 502 (the only one fixable in-session) we manually squash-merged the branch to master via `git merge --squash`.
## Forensic / safety tags worth knowing about

- **`rogue-commit-2026-04-09-ac9f3ecf`** — an autonomous agent committed ~778 lines (a different, broken implementation of 478's WS sync layer) directly to master under the user's git identity, without authorization. We reverted the commit but preserved this tag for the incident postmortem. **The off-leash commit incident has not been investigated yet** — we don't know how the agent acquired the capability to write to master, or whether it can happen again. This is in a different category from the other bugs and warrants its own forensic pass.
- **`pre-502-reset-2026-04-09`** — the master tip immediately before the reset that removed the rogue commit. Useful for cross-referencing.
- **`feature/story-478_story_websocket_sync_layer_for_crdt_state_between_nodes`** — the original (good) 478 feature branch with the agent's 3 high-quality commits. Preserved.
- **`feature/520_state_machine_sketch`** — the branch where the typed-state-machine sketch lives.
## The architectural agreement

1. **CRDT (`crdt_ops` table) is the source of truth** for syncable state. Replay deterministically reconstructs the in-memory CRDT.
2. **`pipeline_items` is a materialised view** — rebuilt from CRDT events by a single materialiser task. *No code writes directly to it.*
3. **Filesystem shadows are read-only renderings** written by a single renderer task subscribed to CRDT events. *No code reads from them for state purposes.*
4. **Local execution state (`ExecutionState`) is per-node and lives in the CRDT under each node's pubkey** — local-authored but globally readable. This enables cross-node observability and heartbeat detection, and is the foundation for story 479 (CRDT work claiming).
5. **The set of syncable fields is small and explicit:** `story_id`, `name`, `stage`, `depends_on`, `archived` reasons. Local-only fields (current agent, retry counts, timers) are NOT in the CRDT.
6. **The application layer is a typed Rust state machine.** Stage is an enum, transitions are a pure function, and side effects are dispatched by an event bus to independent subscribers (matrix bot, file renderer, pipeline_items materialiser, web UI broadcaster, auto-assign).
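Item 2's projection boundary can be illustrated as a fallible conversion from the loose CRDT view into the strict application type. This is a minimal sketch only — the type and variant names are hypothetical and the stage set is reduced; the real definitions live in the sketch on the branch:

```rust
/// Loose-typed view as it comes out of the CRDT layer (shape hypothetical).
#[derive(Debug, Clone)]
pub struct PipelineItemCrdt {
    pub story_id: String,
    pub stage: String, // free-form string at the CRDT layer
}

/// Strictly-typed stage used above the projection boundary (subset for illustration).
#[derive(Debug, Clone, PartialEq)]
pub enum Stage {
    Backlog,
    Current,
    Qa,
}

#[derive(Debug, Clone, PartialEq)]
pub struct PipelineItem {
    pub story_id: String,
    pub stage: Stage,
}

#[derive(Debug, PartialEq)]
pub struct ProjectionError(pub String);

impl TryFrom<&PipelineItemCrdt> for PipelineItem {
    type Error = ProjectionError;

    /// The projection boundary: loose CRDT data either maps onto a strict
    /// type or fails loudly, instead of leaking stringly-typed state upward.
    fn try_from(raw: &PipelineItemCrdt) -> Result<Self, Self::Error> {
        let stage = match raw.stage.as_str() {
            "backlog" => Stage::Backlog,
            "current" => Stage::Current,
            "qa" => Stage::Qa,
            other => return Err(ProjectionError(format!("unknown stage: {other}"))),
        };
        Ok(PipelineItem { story_id: raw.story_id.clone(), stage })
    }
}
```

Everything above this conversion works only with `PipelineItem`; the CRDT's looseness never escapes the boundary.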
## The state machine sketch

Branch: **`feature/520_state_machine_sketch`**

File: **`server/examples/pipeline_state_sketch.rs`**

Run with:

```sh
cargo run --example pipeline_state_sketch -p huskies
cargo test --example pipeline_state_sketch -p huskies
```

What it contains:
- `Stage` enum: `Backlog`, `Current`, `Qa`, `Merge { feature_branch, commits_ahead: NonZeroU32 }`, `Done { merged_at, merge_commit }`, `Archived { archived_at, reason }`
- `ArchiveReason` enum: `Completed | Abandoned | Superseded { by } | Blocked { reason } | MergeFailed { reason } | ReviewHeld { reason }` — subsumes the old `blocked` / `merge_failure` / `review_hold` mess from refactor 436
- `ExecutionState` enum: `Idle | Pending | Running { last_heartbeat } | RateLimited | Completed`
- `transition(state, event) -> Result<Stage, TransitionError>` — a pure function, exhaustively pattern-matched
- `execution_transition(...)` — the same shape for the per-node execution state machine
- `EventBus` + 3 example subscribers (`MatrixBotSub`, `PipelineItemsSub`, `FileRendererSub`)
- Unit tests demonstrating: the happy path, retry loops, invalid-transition errors, bug 519 unrepresentability (can't construct `Merge` with zero commits ahead — `NonZeroU32::new(0)` returns `None`), and bug 502 unrepresentability (`Stage::Merge` has no agent field, so a coder-on-merge state can't be expressed)
- A `main()` that walks a story through the happy path and prints side effects from the bus
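To make the unrepresentability claims above concrete, here is a condensed illustration of the `transition` shape. It covers only a subset of the stages, and the `Event` / `TransitionError` variant names are assumptions for this sketch, not the branch's actual definitions:

```rust
use std::num::NonZeroU32;

#[derive(Debug, Clone, PartialEq)]
pub enum Stage {
    Qa,
    /// Bug 519 made unrepresentable: commits_ahead can never be zero,
    /// because NonZeroU32::new(0) returns None at construction time.
    Merge { feature_branch: String, commits_ahead: NonZeroU32 },
    Done { merge_commit: String },
}

#[derive(Debug, PartialEq)]
pub enum Event {
    QaPassed { feature_branch: String, commits_ahead: u32 },
    Merged { merge_commit: String },
}

#[derive(Debug, PartialEq)]
pub enum TransitionError {
    NoCommitsAhead,
    Invalid,
}

/// Pure transition function: no I/O, no side effects, exhaustively matched.
/// Side effects are the event bus's job, downstream of a successful transition.
pub fn transition(state: &Stage, event: Event) -> Result<Stage, TransitionError> {
    match (state, event) {
        (Stage::Qa, Event::QaPassed { feature_branch, commits_ahead }) => {
            let commits_ahead =
                NonZeroU32::new(commits_ahead).ok_or(TransitionError::NoCommitsAhead)?;
            Ok(Stage::Merge { feature_branch, commits_ahead })
        }
        (Stage::Merge { .. }, Event::Merged { merge_commit }) => {
            Ok(Stage::Done { merge_commit })
        }
        _ => Err(TransitionError::Invalid),
    }
}
```

A mergemaster with nothing to merge simply cannot be reached: the `QaPassed` event with zero commits fails at the type boundary instead of burning a session.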
The sketch deliberately uses no external state-machine library. The user originally suggested `statig` (<https://crates.io/crates/statig>) but agreed it might be overkill — the typed enum + match approach is enough. If hierarchical states become useful later (e.g. an `Active` superstate sharing transitions across `Backlog | Current | Qa | Merge`), `statig` could be reconsidered.

## Stories filed today (the work is in pipeline_items + filesystem shadows)

**Bugs (500-511):**
- **500** — Remove duplicate `[pty-debug]` log lines (every event gets logged twice)
- **501** — Rate-limit retry timer keeps firing after `stop_agent` / `move_story` / successful completion ⚠️ load-bearing
- **502** — Mergemaster gets demoted to current via a bug in `start.rs:53` ✅ FIXED + shipped at commit `8b2e068d`
- **503** — `depends_on` pointing at an archived story is silently treated as deps-met ✅ FIXED + shipped at commit `41515e3b` (but flaps in pipeline state due to bug 510)
- **509** — `create_story` silently drops the `description` parameter (no error; the schema doesn't list it)
- **510** — Filesystem shadows in `1_backlog/` get re-promoted by rate-limit retry timers, yanking successfully-merged stories back into current ⚠️ likely root cause of much of today's flapping
- **511** — CRDT lamport clock resets to 1 on server restart instead of resuming from `MAX(seq) + 1` 🔥 **FOUNDATION** — fix this first
**Stories (504-508, 512-520):**

- **504** — `update_story.front_matter` MCP schema only takes string values
- **505-508** — The 478 split-up: SignedOp wire codec, WS sync endpoint, inbound apply + causal queue, rendezvous config (478's actual code is already on master via the manual squash-merge, but these stories still document the underlying chunks)
- **512** — Migrate chat commands from filesystem lookup to CRDT/DB (`move 503 done` failed today because of this)
- **513** — Startup reconcile pass for state-drift detection (scaffolding; deletes itself when the migration completes)
- **514** — `delete_story` should do a full cleanup (DB row + CRDT op + worktree + timers + filesystem)
- **515** — Add a debug MCP tool to dump the in-memory CRDT
- **516** — `update_story.description` should create the section if it doesn't exist
- **517** — Remove filesystem-shadow fallback paths from `lifecycle.rs`
- **518** — `apply_and_persist` should log `persist_tx.send()` failures instead of silently dropping ops
- **519** — Mergemaster should detect "no commits ahead of master" and fail loudly instead of exiting silently and burning $0.82 per session
- **520** — 🔑 **Typed pipeline state machine in Rust** — the foundational architectural story everything else converges to. Subsumes refactor 436.

**Refactor 436** (was: "Unify story stuck states into a single status field") — marked superseded by 520 via `front_matter: superseded_by: "520"`. Its functionality is now part of `Stage::Archived { reason: ArchiveReason }` in the sketch.
## Recommended next-session priority order

1. **Fix bug 511 first** (CRDT lamport seq reset). ~30 lines in `crdt_state.rs::init()`. After CRDT replay, seed the local seq counter from `MAX(seq)` over our own author. Without this, CRDT replay produces broken state and 510 keeps biting.
2. **Verify the 511 fix unblocks 510.** Hypothesis: 510 (filesystem shadow split-brain) is largely a downstream symptom of 511 (replay puts ops in the wrong order, in-memory state diverges, the materialiser re-creates shadows from old state). If true, 510 may need only a small additional cleanup pass.
3. **Read the state machine sketch and refine it.** Specifically:
   - Verify the local-vs-syncable field partition is right
   - Confirm `Stage::Merge` and `Stage::Done` carry exactly the data we need
   - Add any missing transitions
   - Decide whether `ExecutionState` should be in the same CRDT or a separate one (we tentatively chose the same CRDT under per-node-pubkey keys, for cross-node observability and heartbeat)
4. **Land story 520** — promote the sketch to a real `server/src/pipeline_state.rs` module. Implement the projection layer (`TryFrom<&PipelineItemCrdt> for PipelineItem`).
5. **Migrate consumers one at a time** in priority order: chat commands (512) → lifecycle (517) → delete_story (514) → mergemaster precondition (519, mostly subsumed by `NonZeroU32`).
6. **Once nothing reads the loose `PipelineItemView` anymore, delete the loose API.** The CRDT looseness becomes purely an implementation detail.
7. **Then the off-leash commit forensic pass** — investigate `rogue-commit-2026-04-09-ac9f3ecf`. How did an agent acquire `git push` capability? What code path enabled it? File a security-critical bug.
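The seq-seeding logic in step 1 is small enough to sketch independently of the SQLite plumbing. The `PersistedOp` shape and function name here are hypothetical; in `crdt_state.rs::init()` the equivalent would run over the replayed `crdt_ops` rows:

```rust
/// A persisted CRDT op as replayed from the crdt_ops table (fields hypothetical).
pub struct PersistedOp {
    pub author: String,
    pub seq: u64,
}

/// Seed the local lamport counter after replay: resume from MAX(seq) + 1
/// over our own author instead of resetting to 1 on every restart (bug 511).
pub fn initial_seq(ops: &[PersistedOp], own_author: &str) -> u64 {
    ops.iter()
        .filter(|op| op.author == own_author) // only ops we authored count
        .map(|op| op.seq)
        .max()
        .map(|max| max + 1)
        .unwrap_or(1) // fresh node with no prior ops starts at 1
}
```

Ops authored by other nodes are deliberately ignored: each node only ever advances its own counter, which is what keeps the per-author seq monotonic across restarts.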
## What's currently weird / broken in the running system

- **`timers.json` keeps getting re-populated** even after we empty it. The cause: stopping an agent triggers the agent's exit handler, which calls the rate-limit auto-resume scheduler, which writes to `timers.json`. Bug 501 should cover this, but it might need to be explicit about the stop-agent code path.
- **Chat commands can't find stories that have no filesystem shadow.** Bug 512. Workaround: use MCP `move_story` / `delete_story` / etc. directly, NOT the web UI chat commands.
- **The web UI shows stale state** for some stories because the API reads from the in-memory CRDT view, which can diverge from `pipeline_items`. This will be fixed naturally by 520 + 517 (single source of truth).
- **`create_worktree` always creates from master** — an intentional design choice ("keep conflicts low"), but it means an existing feature branch's work can't be reused. Bit us with 478 today.
- **Mergemaster's `merge_agent_work` exits silently** when there are no commits ahead of master — we lost ~$0.82 to one such session today. Bug 519 + the typed `NonZeroU32` constraint in story 520 will make this unrepresentable.
## Useful diagnostic recipes from today

- **View persisted CRDT ops:** `sqlite3 .huskies/pipeline.db "SELECT seq, substr(op_json, 1, 200) FROM crdt_ops ORDER BY seq DESC LIMIT 20"`
- **View in-memory CRDT pipeline state:** call `mcp__huskies__get_pipeline_status` (it goes through `crdt_state::read_all_items()`)
- **Tail the server log filtered for bug 502 firings:** `tail -f .huskies/logs/server.log | grep --line-buffered "Failed to start mergemaster"`
- **Tail the server log without `[pty-debug]` noise:** `tail -f .huskies/logs/server.log | grep -v "\[pty-debug\]"`
- **Check current pending timers:** `cat .huskies/timers.json`
- **Forensically delete a story across all four state stores:** stop agents → remove worktree → empty timers → `DELETE FROM pipeline_items WHERE id LIKE '<id>%'` → `DELETE FROM crdt_ops WHERE op_json LIKE '%<id>%'`
## Token cost accounting

This session burned roughly **$15-25** in agent thrash, mostly from bugs 501 and 510 respawning agents on already-completed stories. Once 511 + 510 + 501 are fixed, that bleed disappears.

## Open questions for the next session

1. **Should `ExecutionState` live in the same CRDT or a separate one?** We tentatively said the same CRDT under per-node-pubkey keys. This needs validating against the bft-json-crdt library's actual capabilities.
2. **Heartbeat cadence?** How often should `last_heartbeat` be updated for `ExecutionState::Running`? Every 30s seems reasonable, but it should be configurable.
3. **What's the migration path from existing pipeline_items rows to typed `PipelineItem`s?** A one-time migration script, or a rebuild from `crdt_ops`?
4. **Should we add `statig` after all?** Probably not for the initial implementation, but worth revisiting if we end up wanting hierarchical states (e.g., a `Working` superstate sharing transitions across active stages).
Generated
+47
-41
@@ -229,7 +229,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "94893f1e0c6eeab764ade8dc4c0db24caf4fe7cbbaafc0eba0a9030f447b5185"
 dependencies = [
  "num-traits",
- "rand 0.8.5",
+ "rand 0.8.6",
 ]
 
 [[package]]
@@ -366,9 +366,9 @@ checksum = "c08606f8c3cbf4ce6ec8e28fb0014a2c086708fe954eaa885384a6165172e7e8"
 
 [[package]]
 name = "aws-lc-rs"
-version = "1.16.2"
+version = "1.16.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "a054912289d18629dc78375ba2c3726a3afe3ff71b4edba9dedfca0e3446d1fc"
+checksum = "0ec6fb3fe69024a75fa7e1bfb48aa6cf59706a101658ea01bfd33b2b248a038f"
 dependencies = [
  "aws-lc-sys",
  "zeroize",
@@ -376,9 +376,9 @@ dependencies = [
 
 [[package]]
 name = "aws-lc-sys"
-version = "0.39.1"
+version = "0.40.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "83a25cf98105baa966497416dbd42565ce3a8cf8dbfd59803ec9ad46f3126399"
+checksum = "f50037ee5e1e41e7b8f9d161680a725bd1626cb6f8c7e901f91f942850852fe7"
 dependencies = [
  "cc",
  "cmake",
@@ -441,7 +441,7 @@ dependencies = [
  "criterion",
  "fastcrypto",
  "indexmap 2.14.0",
- "rand 0.8.5",
+ "rand 0.8.6",
  "random_color",
  "serde",
  "serde_json",
@@ -1649,7 +1649,7 @@ dependencies = [
  "num-bigint",
  "once_cell",
  "p256",
- "rand 0.8.5",
+ "rand 0.8.6",
  "readonly",
  "rfc6979",
  "rsa 0.8.2",
@@ -2288,7 +2288,7 @@ checksum = "df3b46402a9d5adb4c86a0cf463f42e19994e3ee891101b1841f30a545cb49a9"
 
 [[package]]
 name = "huskies"
-version = "0.10.2"
+version = "0.10.4"
 dependencies = [
  "async-stream",
  "async-trait",
@@ -2802,9 +2802,9 @@ dependencies = [
 
 [[package]]
 name = "konst"
-version = "0.3.16"
+version = "0.3.17"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "4381b9b00c55f251f2ebe9473aef7c117e96828def1a7cb3bd3f0f903c6894e9"
+checksum = "97feab15b395d1860944abe6a8dd8ed9f8eadfae01750fada8427abda531d887"
 dependencies = [
  "const_panic",
  "konst_kernel",
@@ -3165,7 +3165,7 @@ dependencies = [
  "js_option",
  "matrix-sdk-common",
  "pbkdf2",
- "rand 0.8.5",
+ "rand 0.8.6",
  "rmp-serde",
  "ruma",
  "serde",
@@ -3255,7 +3255,7 @@ dependencies = [
  "getrandom 0.2.17",
  "hmac",
  "pbkdf2",
- "rand 0.8.5",
+ "rand 0.8.6",
  "rmp-serde",
  "serde",
  "serde_json",
@@ -3509,7 +3509,7 @@ dependencies = [
  "num-integer",
  "num-iter",
  "num-traits",
- "rand 0.8.5",
+ "rand 0.8.6",
  "smallvec",
  "zeroize",
 ]
@@ -3570,7 +3570,7 @@ dependencies = [
  "chrono",
  "getrandom 0.2.17",
  "http",
- "rand 0.8.5",
+ "rand 0.8.6",
  "reqwest 0.12.28",
  "serde",
  "serde_json",
@@ -3726,7 +3726,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "3c80231409c20246a13fddb31776fb942c38553c51e871f8cbd687a4cfb5843d"
 dependencies = [
  "phf_shared 0.11.3",
- "rand 0.8.5",
+ "rand 0.8.6",
 ]
 
 [[package]]
@@ -4231,9 +4231,9 @@ dependencies = [
 
 [[package]]
 name = "rand"
-version = "0.8.5"
+version = "0.8.6"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "34af8d1a0e25924bc5b7c43c079c942339d8f0a8b57c39049bef581b46327404"
+checksum = "5ca0ecfa931c29007047d1bc58e623ab12e5590e8c7cc53200d5202b69266d8a"
 dependencies = [
  "libc",
  "rand_chacha 0.3.1",
@@ -4693,7 +4693,7 @@ dependencies = [
  "js_int",
  "konst",
  "percent-encoding",
- "rand 0.8.5",
+ "rand 0.8.6",
  "regex",
  "ruma-identifiers-validation",
  "ruma-macros",
@@ -4803,7 +4803,7 @@ dependencies = [
  "base64",
  "ed25519-dalek",
  "pkcs8 0.10.2",
- "rand 0.8.5",
+ "rand 0.8.6",
  "ruma-common",
  "serde_json",
  "sha2 0.10.9",
@@ -4952,9 +4952,9 @@ checksum = "f87165f0995f63a9fbeea62b64d10b4d9d8e78ec6d7d51fb2125fda7bb36788f"
 
 [[package]]
 name = "rustls-webpki"
-version = "0.103.12"
+version = "0.103.13"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "8279bb85272c9f10811ae6a6c547ff594d6a7f3c6c6b02ee9726d1d0dcfcdd06"
+checksum = "61c429a8649f110dddef65e2a5ad240f747e85f7758a6bccc7e5777bd33f756e"
 dependencies = [
  "aws-lc-rs",
  "ring",
@@ -5078,7 +5078,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "25996b82292a7a57ed3508f052cfff8640d38d32018784acd714758b43da9c8f"
 dependencies = [
  "bitcoin_hashes",
- "rand 0.8.5",
+ "rand 0.8.6",
  "secp256k1-sys",
 ]
 
@@ -5344,9 +5344,9 @@ dependencies = [
 
 [[package]]
 name = "sha3"
-version = "0.10.8"
+version = "0.10.9"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "75872d278a8f37ef87fa0ddbda7802605cb18344497949862c0d4dcb291eba60"
+checksum = "77fd7028345d415a4034cf8777cd4f8ab1851274233b45f84e3d955502d93874"
 dependencies = [
  "digest 0.10.7",
  "keccak",
@@ -5587,7 +5587,7 @@ dependencies = [
  "md-5",
  "memchr",
  "percent-encoding",
- "rand 0.8.5",
+ "rand 0.8.6",
  "rsa 0.9.10",
  "sha1",
  "sha2 0.10.9",
@@ -5623,7 +5623,7 @@ dependencies = [
  "log",
  "md-5",
  "memchr",
- "rand 0.8.5",
+ "rand 0.8.6",
  "serde",
  "serde_json",
  "sha2 0.10.9",
@@ -5996,9 +5996,9 @@ checksum = "1f3ccbac311fea05f86f61904b462b55fb3df8837a366dfc601a0161d0532f20"
 
 [[package]]
 name = "tokio"
-version = "1.52.0"
+version = "1.52.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "a91135f59b1cbf38c91e73cf3386fca9bb77915c45ce2771460c9d92f0f3d776"
+checksum = "b67dee974fe86fd92cc45b7a95fdd2f99a36a6d7b0d431a231178d3d670bbcc6"
 dependencies = [
  "bytes",
  "libc",
@@ -6327,9 +6327,9 @@ dependencies = [
 
 [[package]]
 name = "typenum"
-version = "1.19.0"
+version = "1.20.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "562d481066bde0658276a35467c4af00bdc6ee726305698a55b86e61d7ad82bb"
+checksum = "40ce102ab67701b8526c123c1bab5cbe42d7040ccfd0f64af1a385808d2f43de"
 
 [[package]]
 name = "typewit"
@@ -6465,9 +6465,9 @@ checksum = "b6c140620e7ffbb22c2dee59cafe6084a59b5ffc27a8859a5f0d494b5d52b6be"
 
 [[package]]
 name = "uuid"
-version = "1.23.0"
+version = "1.23.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "5ac8b6f42ead25368cf5b098aeb3dc8a1a2c05a3eee8a9a1a68c640edbfc79d9"
+checksum = "ddd74a9687298c6858e9b88ec8935ec45d22e8fd5e6394fa1bd4e99a87789c76"
 dependencies = [
  "getrandom 0.4.2",
  "js-sys",
@@ -6512,7 +6512,7 @@ dependencies = [
  "hmac",
  "matrix-pickle",
  "prost",
- "rand 0.8.5",
+ "rand 0.8.6",
  "serde",
  "serde_bytes",
  "serde_json",
@@ -6580,11 +6580,11 @@ checksum = "ccf3ec651a847eb01de73ccad15eb7d99f80485de043efb2f370cd654f4ea44b"
 
 [[package]]
 name = "wasip2"
-version = "1.0.2+wasi-0.2.9"
+version = "1.0.3+wasi-0.2.9"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "9517f9239f02c069db75e65f174b3da828fe5f5b945c4dd26bd25d89c03ebcf5"
+checksum = "20064672db26d7cdc89c7798c48a0fdfac8213434a1186e5ef29fd560ae223d6"
 dependencies = [
- "wit-bindgen",
+ "wit-bindgen 0.57.1",
 ]
 
 [[package]]
@@ -6593,7 +6593,7 @@ version = "0.4.0+wasi-0.3.0-rc-2026-01-06"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "5428f8bf88ea5ddc08faddef2ac4a67e390b88186c703ce6dbd955e1c145aca5"
 dependencies = [
- "wit-bindgen",
+ "wit-bindgen 0.51.0",
 ]
 
 [[package]]
@@ -6770,18 +6770,18 @@ dependencies = [
 
 [[package]]
 name = "webpki-root-certs"
-version = "1.0.6"
+version = "1.0.7"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "804f18a4ac2676ffb4e8b5b5fa9ae38af06df08162314f96a68d2a363e21a8ca"
+checksum = "f31141ce3fc3e300ae89b78c0dd67f9708061d1d2eda54b8209346fd6be9a92c"
 dependencies = [
  "rustls-pki-types",
 ]
 
 [[package]]
 name = "webpki-roots"
-version = "1.0.6"
+version = "1.0.7"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "22cfaf3c063993ff62e73cb4311efde4db1efb31ab78a3e5c457939ad5cc0bed"
+checksum = "52f5ee44c96cf55f1b349600768e3ece3a8f26010c05265ab73f945bb1a2eb9d"
 dependencies = [
  "rustls-pki-types",
 ]
@@ -7271,6 +7271,12 @@ dependencies = [
  "wit-bindgen-rust-macro",
 ]
 
+[[package]]
+name = "wit-bindgen"
+version = "0.57.1"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "1ebf944e87a7c253233ad6766e082e3cd714b5d03812acc24c318f549614536e"
+
 [[package]]
 name = "wit-bindgen-core"
 version = "0.51.0"
Generated
+2
-2
@@ -1,12 +1,12 @@
 {
   "name": "huskies",
-  "version": "0.10.2",
+  "version": "0.10.4",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "huskies",
-      "version": "0.10.2",
+      "version": "0.10.4",
       "dependencies": {
         "@types/react-syntax-highlighter": "^15.5.13",
         "react": "^19.1.0",
@@ -1,7 +1,7 @@
 {
   "name": "huskies",
   "private": true,
-  "version": "0.10.2",
+  "version": "0.10.4",
   "type": "module",
   "scripts": {
     "dev": "vite",
@@ -1,4 +1,5 @@
|
|||||||
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
|
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
|
||||||
|
import type { ProjectSettings } from "./settings";
|
||||||
import { settingsApi } from "./settings";
|
import { settingsApi } from "./settings";
|
||||||
|
|
||||||
const mockFetch = vi.fn();
|
const mockFetch = vi.fn();
|
||||||
@@ -22,7 +23,77 @@ function errorResponse(status: number, text: string) {
|
|||||||
return new Response(text, { status });
|
return new Response(text, { status });
|
||||||
}
|
}
|
||||||
|
|
||||||
|
const defaultProjectSettings: ProjectSettings = {
|
||||||
|
default_qa: "server",
|
||||||
|
default_coder_model: null,
|
||||||
|
max_coders: null,
|
||||||
|
max_retries: 2,
|
||||||
|
base_branch: null,
|
||||||
|
rate_limit_notifications: true,
|
||||||
|
timezone: null,
|
||||||
|
rendezvous: null,
|
||||||
|
watcher_sweep_interval_secs: 60,
|
||||||
|
watcher_done_retention_secs: 14400,
|
||||||
|
};
|
||||||
|
|
||||||
describe("settingsApi", () => {
|
describe("settingsApi", () => {
|
||||||
|
describe("getProjectSettings", () => {
|
||||||
|
it("sends GET to /settings and returns project settings", async () => {
|
||||||
|
mockFetch.mockResolvedValueOnce(okResponse(defaultProjectSettings));
|
||||||
|
|
||||||
|
const result = await settingsApi.getProjectSettings();
|
||||||
|
|
||||||
|
expect(mockFetch).toHaveBeenCalledWith(
|
||||||
|
"/api/settings",
|
||||||
|
expect.objectContaining({
|
||||||
|
headers: expect.objectContaining({
|
||||||
|
"Content-Type": "application/json",
|
||||||
|
}),
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
expect(result).toEqual(defaultProjectSettings);
|
||||||
|
});
|
||||||
|
|
||||||
|
it("uses custom baseUrl when provided", async () => {
|
||||||
|
mockFetch.mockResolvedValueOnce(okResponse(defaultProjectSettings));
|
||||||
|
await settingsApi.getProjectSettings("http://localhost:4000/api");
|
||||||
|
expect(mockFetch).toHaveBeenCalledWith(
|
||||||
|
"http://localhost:4000/api/settings",
|
||||||
|
expect.anything(),
|
||||||
|
);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe("putProjectSettings", () => {
|
||||||
|
it("sends PUT to /settings with settings body", async () => {
|
||||||
|
const updated = { ...defaultProjectSettings, default_qa: "agent" };
|
||||||
|
mockFetch.mockResolvedValueOnce(okResponse(updated));
|
||||||
|
|
||||||
|
const result = await settingsApi.putProjectSettings(updated);
|
||||||
|
|
||||||
|
expect(mockFetch).toHaveBeenCalledWith(
|
||||||
|
"/api/settings",
|
||||||
|
expect.objectContaining({
|
||||||
|
method: "PUT",
|
||||||
|
body: JSON.stringify(updated),
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
expect(result.default_qa).toBe("agent");
|
||||||
|
});
|
||||||
|
|
||||||
|
it("throws on validation error", async () => {
|
||||||
|
mockFetch.mockResolvedValueOnce(
|
||||||
|
errorResponse(400, "Invalid default_qa value"),
|
||||||
|
);
|
||||||
|
await expect(
|
||||||
|
settingsApi.putProjectSettings({
|
||||||
|
...defaultProjectSettings,
|
||||||
|
default_qa: "invalid",
|
||||||
|
}),
|
||||||
|
).rejects.toThrow("Invalid default_qa value");
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
describe("getEditorCommand", () => {
|
describe("getEditorCommand", () => {
|
||||||
it("sends GET to /settings/editor and returns editor settings", async () => {
|
it("sends GET to /settings/editor and returns editor settings", async () => {
|
||||||
const expected = { editor_command: "zed" };
|
const expected = { editor_command: "zed" };
|
||||||
|
|||||||
@@ -2,6 +2,19 @@ export interface EditorSettings {
   editor_command: string | null;
 }
 
+export interface ProjectSettings {
+  default_qa: string;
+  default_coder_model: string | null;
+  max_coders: number | null;
+  max_retries: number;
+  base_branch: string | null;
+  rate_limit_notifications: boolean;
+  timezone: string | null;
+  rendezvous: string | null;
+  watcher_sweep_interval_secs: number;
+  watcher_done_retention_secs: number;
+}
+
 export interface OpenFileResult {
   success: boolean;
 }
@@ -34,6 +47,21 @@ async function requestJson<T>(
 }
 
 export const settingsApi = {
+  getProjectSettings(baseUrl?: string): Promise<ProjectSettings> {
+    return requestJson<ProjectSettings>("/settings", {}, baseUrl);
+  },
+
+  putProjectSettings(
+    settings: ProjectSettings,
+    baseUrl?: string,
+  ): Promise<ProjectSettings> {
+    return requestJson<ProjectSettings>(
+      "/settings",
+      { method: "PUT", body: JSON.stringify(settings) },
+      baseUrl,
+    );
+  },
+
   getEditorCommand(baseUrl?: string): Promise<EditorSettings> {
     return requestJson<EditorSettings>("/settings/editor", {}, baseUrl);
   },
@@ -9,6 +9,7 @@ import { useChatWebSocket } from "../hooks/useChatWebSocket";
 import { estimateTokens, getContextWindowSize } from "../utils/chatUtils";
 import { ApiKeyDialog } from "./ApiKeyDialog";
 import { BotConfigPage } from "./BotConfigPage";
+import { SettingsPage } from "./SettingsPage";
 import { ChatHeader } from "./ChatHeader";
 import type { ChatInputHandle } from "./ChatInput";
 import { ChatInput } from "./ChatInput";
@@ -62,7 +63,7 @@ export function Chat({
     null,
   );
   const [showHelp, setShowHelp] = useState(false);
-  const [view, setView] = useState<"chat" | "bot-config">("chat");
+  const [view, setView] = useState<"chat" | "bot-config" | "settings">("chat");
   const [queuedMessages, setQueuedMessages] = useState<
     { id: string; text: string }[]
   >([]);
@@ -376,16 +377,21 @@ export function Chat({
         wsConnected={wsConnected}
         oauthStatus={oauthStatus}
         onShowBotConfig={() => setView("bot-config")}
+        onShowSettings={() => setView("settings")}
       />
 
       {view === "bot-config" && (
         <BotConfigPage onBack={() => setView("chat")} />
       )}
+
+      {view === "settings" && (
+        <SettingsPage onBack={() => setView("chat")} />
+      )}
 
       <div
         data-testid="chat-content-area"
         style={{
-          display: view === "bot-config" ? "none" : "flex",
+          display: view === "chat" ? "flex" : "none",
           flex: 1,
           minHeight: 0,
           flexDirection: isNarrowScreen ? "column" : "row",
@@ -35,6 +35,7 @@ interface ChatHeaderProps {
   wsConnected: boolean;
   oauthStatus?: OAuthStatus | null;
   onShowBotConfig?: () => void;
+  onShowSettings?: () => void;
 }
 
 const getContextEmoji = (percentage: number): string => {
@@ -60,6 +61,7 @@ export function ChatHeader({
   wsConnected,
   oauthStatus = null,
   onShowBotConfig,
+  onShowSettings,
 }: ChatHeaderProps) {
   const hasModelOptions = availableModels.length > 0 || claudeModels.length > 0;
   const [showConfirm, setShowConfirm] = useState(false);
@@ -552,6 +554,43 @@ export function ChatHeader({
         </button>
       )}
+
+      {onShowSettings && (
+        <button
+          type="button"
+          onClick={onShowSettings}
+          title="Edit project.toml settings"
+          style={{
+            padding: "6px 12px",
+            borderRadius: "99px",
+            border: "none",
+            fontSize: "0.85em",
+            backgroundColor: "#2f2f2f",
+            color: "#888",
+            cursor: "pointer",
+            outline: "none",
+            transition: "all 0.2s",
+          }}
+          onMouseOver={(e) => {
+            e.currentTarget.style.backgroundColor = "#3f3f3f";
+            e.currentTarget.style.color = "#ccc";
+          }}
+          onMouseOut={(e) => {
+            e.currentTarget.style.backgroundColor = "#2f2f2f";
+            e.currentTarget.style.color = "#888";
+          }}
+          onFocus={(e) => {
+            e.currentTarget.style.backgroundColor = "#3f3f3f";
+            e.currentTarget.style.color = "#ccc";
+          }}
+          onBlur={(e) => {
+            e.currentTarget.style.backgroundColor = "#2f2f2f";
+            e.currentTarget.style.color = "#888";
+          }}
+        >
+          ⚙ Settings
+        </button>
+      )}
 
       {hasModelOptions ? (
         <select
           value={model}
@@ -0,0 +1,461 @@
+import * as React from "react";
+import type { ProjectSettings } from "../api/settings";
+import { settingsApi } from "../api/settings";
+
+const { useState, useEffect } = React;
+
+interface SettingsPageProps {
+  onBack: () => void;
+}
+
+const fieldStyle: React.CSSProperties = {
+  display: "flex",
+  flexDirection: "column",
+  gap: "4px",
+};
+
+const labelStyle: React.CSSProperties = {
+  fontSize: "0.8em",
+  color: "#aaa",
+  fontWeight: 500,
+};
+
+const descStyle: React.CSSProperties = {
+  fontSize: "0.75em",
+  color: "#666",
+  marginTop: "2px",
+};
+
+const inputStyle: React.CSSProperties = {
+  padding: "8px 10px",
+  borderRadius: "6px",
+  border: "1px solid #333",
+  background: "#1e1e1e",
+  color: "#ececec",
+  fontSize: "0.9em",
+  fontFamily: "monospace",
+  outline: "none",
+};
+
+const sectionStyle: React.CSSProperties = {
+  background: "#1e1e1e",
+  border: "1px solid #333",
+  borderRadius: "8px",
+  padding: "20px",
+  display: "flex",
+  flexDirection: "column",
+  gap: "16px",
+};
+
+const sectionTitleStyle: React.CSSProperties = {
+  fontSize: "0.85em",
+  fontWeight: 600,
+  color: "#aaa",
+  textTransform: "uppercase",
+  letterSpacing: "0.06em",
+  marginBottom: "2px",
+};
+
+interface TextFieldProps {
+  label: string;
+  description?: string;
+  value: string;
+  onChange: (v: string) => void;
+  placeholder?: string;
+}
+
+function TextField({ label, description, value, onChange, placeholder }: TextFieldProps) {
+  return (
+    <div style={fieldStyle}>
+      <label style={labelStyle}>{label}</label>
+      {description && <span style={descStyle}>{description}</span>}
+      <input
+        type="text"
+        value={value}
+        onChange={(e) => onChange(e.target.value)}
+        placeholder={placeholder ?? ""}
+        style={inputStyle}
+        autoComplete="off"
+      />
+    </div>
+  );
+}
+
+interface NumberFieldProps {
+  label: string;
+  description?: string;
+  value: number | null;
+  onChange: (v: number | null) => void;
+  min?: number;
+  placeholder?: string;
+}
+
+function NumberField({ label, description, value, onChange, min, placeholder }: NumberFieldProps) {
+  return (
+    <div style={fieldStyle}>
+      <label style={labelStyle}>{label}</label>
+      {description && <span style={descStyle}>{description}</span>}
+      <input
+        type="number"
+        value={value === null ? "" : value}
+        min={min}
+        onChange={(e) => {
+          const raw = e.target.value.trim();
+          if (raw === "") {
+            onChange(null);
+          } else {
+            const n = Number(raw);
+            if (!Number.isNaN(n)) onChange(n);
+          }
+        }}
+        placeholder={placeholder ?? ""}
+        style={inputStyle}
+      />
+    </div>
+  );
+}
+
+interface CheckboxFieldProps {
+  label: string;
+  description?: string;
+  checked: boolean;
+  onChange: (v: boolean) => void;
+}
+
+function CheckboxField({ label, description, checked, onChange }: CheckboxFieldProps) {
+  return (
+    <div style={fieldStyle}>
+      {description && <span style={descStyle}>{description}</span>}
+      <label
+        style={{
+          display: "flex",
+          alignItems: "center",
+          gap: "8px",
+          cursor: "pointer",
+          fontSize: "0.9em",
+          color: "#ccc",
+        }}
+      >
+        <input
+          type="checkbox"
+          checked={checked}
+          onChange={(e) => onChange(e.target.checked)}
+        />
+        {label}
+      </label>
+    </div>
+  );
+}
+
+const QA_MODES = ["server", "agent", "human"] as const;
+
+/** Settings page — form-based editor for project.toml scalar settings. */
+export function SettingsPage({ onBack }: SettingsPageProps) {
+  const [settings, setSettings] = useState<ProjectSettings | null>(null);
+  const [status, setStatus] = useState<"idle" | "loading" | "saving" | "saved" | "error">("loading");
+  const [errorMsg, setErrorMsg] = useState<string | null>(null);
+  const [validationErrors, setValidationErrors] = useState<Record<string, string>>({});
+
+  useEffect(() => {
+    settingsApi
+      .getProjectSettings()
+      .then((s) => {
+        setSettings(s);
+        setStatus("idle");
+      })
+      .catch((e: unknown) => {
+        setStatus("error");
+        setErrorMsg(e instanceof Error ? e.message : "Failed to load settings");
+      });
+  }, []);
+
+  function patch(partial: Partial<ProjectSettings>) {
+    setSettings((prev) => (prev ? { ...prev, ...partial } : prev));
+    setValidationErrors({});
+  }
+
+  function validate(s: ProjectSettings): Record<string, string> {
+    const errors: Record<string, string> = {};
+    if (!QA_MODES.includes(s.default_qa as (typeof QA_MODES)[number])) {
+      errors.default_qa = `Must be one of: ${QA_MODES.join(", ")}`;
+    }
+    if (s.max_retries < 0) {
+      errors.max_retries = "Must be 0 or greater";
+    }
+    if (s.watcher_sweep_interval_secs < 1) {
+      errors.watcher_sweep_interval_secs = "Must be at least 1 second";
+    }
+    if (s.watcher_done_retention_secs < 1) {
+      errors.watcher_done_retention_secs = "Must be at least 1 second";
+    }
+    return errors;
+  }
+
+  async function handleSave() {
+    if (!settings) return;
+    const errors = validate(settings);
+    if (Object.keys(errors).length > 0) {
+      setValidationErrors(errors);
+      return;
+    }
+    setStatus("saving");
+    setErrorMsg(null);
+    try {
+      const saved = await settingsApi.putProjectSettings(settings);
+      setSettings(saved);
+      setStatus("saved");
+      setTimeout(() => setStatus("idle"), 2000);
+    } catch (e) {
+      setStatus("error");
+      setErrorMsg(e instanceof Error ? e.message : "Save failed");
+    }
+  }
+
+  const s = settings;
+
+  return (
+    <div
+      style={{
+        display: "flex",
+        flexDirection: "column",
+        height: "100%",
+        backgroundColor: "#171717",
+        color: "#ececec",
+        overflow: "auto",
+      }}
+    >
+      {/* Header */}
+      <div
+        style={{
+          padding: "12px 24px",
+          borderBottom: "1px solid #333",
+          display: "flex",
+          alignItems: "center",
+          gap: "16px",
+          background: "#171717",
+          flexShrink: 0,
+        }}
+      >
+        <button
+          type="button"
+          onClick={onBack}
+          style={{
+            background: "transparent",
+            border: "none",
+            cursor: "pointer",
+            color: "#888",
+            fontSize: "0.9em",
+            padding: "4px 8px",
+            borderRadius: "4px",
+          }}
+        >
+          ← Back
+        </button>
+        <span style={{ fontWeight: 700, fontSize: "1em" }}>Project Settings</span>
+      </div>
+
+      {/* Body */}
+      <div
+        style={{
+          flex: 1,
+          padding: "24px",
+          display: "flex",
+          flexDirection: "column",
+          gap: "20px",
+          maxWidth: "640px",
+        }}
+      >
+        {status === "loading" && (
+          <p style={{ color: "#888", fontSize: "0.9em" }}>Loading settings…</p>
+        )}
+
+        {status === "error" && !s && (
+          <p style={{ color: "#f08080", fontSize: "0.9em" }}>
+            Error: {errorMsg}
+          </p>
+        )}
+
+        {s && (
+          <>
+            {/* Pipeline */}
+            <div style={sectionStyle}>
+              <div style={sectionTitleStyle}>Pipeline</div>
+
+              <div style={fieldStyle}>
+                <label style={labelStyle}>Default QA Mode</label>
+                <span style={descStyle}>
+                  How stories are QA-reviewed after the coder stage.
+                  Default: server.
+                </span>
+                <select
+                  value={s.default_qa}
+                  onChange={(e) => patch({ default_qa: e.target.value })}
+                  style={{ ...inputStyle, cursor: "pointer" }}
+                >
+                  {QA_MODES.map((m) => (
+                    <option key={m} value={m}>
+                      {m}
+                    </option>
+                  ))}
+                </select>
+                {validationErrors.default_qa && (
+                  <span style={{ color: "#f08080", fontSize: "0.8em" }}>
+                    {validationErrors.default_qa}
+                  </span>
+                )}
+              </div>
+
+              <NumberField
+                label="Max Retries"
+                description="Maximum retries per story per pipeline stage before blocking. Default: 2. Set 0 to disable."
+                value={s.max_retries}
+                min={0}
+                onChange={(v) => patch({ max_retries: v ?? 0 })}
+              />
+              {validationErrors.max_retries && (
+                <span style={{ color: "#f08080", fontSize: "0.8em" }}>
+                  {validationErrors.max_retries}
+                </span>
+              )}
+
+              <NumberField
+                label="Max Concurrent Coders"
+                description="Maximum number of coder-stage agents running at once. Leave blank for unlimited."
+                value={s.max_coders}
+                min={1}
+                placeholder="unlimited"
+                onChange={(v) => patch({ max_coders: v })}
+              />
+
+              <TextField
+                label="Default Coder Model"
+                description="When set, only coder agents matching this model are auto-assigned (e.g. sonnet, opus)."
+                value={s.default_coder_model ?? ""}
+                onChange={(v) =>
+                  patch({ default_coder_model: v.trim() || null })
+                }
+                placeholder="e.g. sonnet"
+              />
+            </div>
+
+            {/* Git */}
+            <div style={sectionStyle}>
+              <div style={sectionTitleStyle}>Git</div>
+
+              <TextField
+                label="Base Branch"
+                description="Overrides auto-detection of the merge target branch (e.g. main, master, develop)."
+                value={s.base_branch ?? ""}
+                onChange={(v) =>
+                  patch({ base_branch: v.trim() || null })
+                }
+                placeholder="e.g. master"
+              />
+            </div>
+
+            {/* Notifications */}
+            <div style={sectionStyle}>
+              <div style={sectionTitleStyle}>Notifications</div>
+
+              <CheckboxField
+                label="Rate Limit Notifications"
+                description="Send chat notifications on soft API rate-limit warnings. Disable to reduce noise."
+                checked={s.rate_limit_notifications}
+                onChange={(v) => patch({ rate_limit_notifications: v })}
+              />
+            </div>
+
+            {/* Advanced */}
+            <div style={sectionStyle}>
+              <div style={sectionTitleStyle}>Advanced</div>
+
+              <TextField
+                label="Timezone"
+                description="IANA timezone for timer inputs (e.g. Europe/London, America/New_York). Leave blank for system default."
+                value={s.timezone ?? ""}
+                onChange={(v) => patch({ timezone: v.trim() || null })}
+                placeholder="e.g. Europe/London"
+              />
+
+              <TextField
+                label="Rendezvous URL"
+                description="WebSocket URL of a remote huskies node for CRDT state sync (e.g. ws://host:3001/crdt-sync)."
+                value={s.rendezvous ?? ""}
+                onChange={(v) => patch({ rendezvous: v.trim() || null })}
+                placeholder="e.g. ws://host:3001/crdt-sync"
+              />
+            </div>
+
+            {/* Watcher */}
+            <div style={sectionStyle}>
+              <div style={sectionTitleStyle}>Archiver</div>
+
+              <NumberField
+                label="Sweep Interval (seconds)"
+                description="How often to check the done stage for items ready to archive. Default: 60."
+                value={s.watcher_sweep_interval_secs}
+                min={1}
+                onChange={(v) =>
+                  patch({ watcher_sweep_interval_secs: v ?? 60 })
+                }
+              />
+              {validationErrors.watcher_sweep_interval_secs && (
+                <span style={{ color: "#f08080", fontSize: "0.8em" }}>
+                  {validationErrors.watcher_sweep_interval_secs}
+                </span>
+              )}
+
+              <NumberField
+                label="Done Retention (seconds)"
+                description="How long an item must stay in the done stage before archiving. Default: 14400 (4 hours)."
+                value={s.watcher_done_retention_secs}
+                min={1}
+                onChange={(v) =>
+                  patch({ watcher_done_retention_secs: v ?? 14400 })
+                }
+              />
+              {validationErrors.watcher_done_retention_secs && (
+                <span style={{ color: "#f08080", fontSize: "0.8em" }}>
+                  {validationErrors.watcher_done_retention_secs}
+                </span>
+              )}
+            </div>
+
+            {/* Save */}
+            <div style={{ display: "flex", alignItems: "center", gap: "12px" }}>
+              <button
+                type="button"
+                onClick={handleSave}
+                disabled={status === "saving"}
+                style={{
+                  padding: "8px 24px",
+                  borderRadius: "6px",
+                  border: "none",
+                  background:
+                    status === "saved" ? "#1a5c2a" : "#2563eb",
+                  color: "#fff",
+                  cursor:
+                    status === "saving" ? "not-allowed" : "pointer",
+                  fontSize: "0.9em",
+                  fontWeight: 600,
+                  opacity: status === "saving" ? 0.7 : 1,
+                }}
+              >
+                {status === "saving"
+                  ? "Saving…"
+                  : status === "saved"
+                    ? "Saved!"
+                    : "Save"}
+              </button>
+              {status === "error" && errorMsg && (
+                <span style={{ color: "#f08080", fontSize: "0.85em" }}>
+                  {errorMsg}
+                </span>
+              )}
+            </div>
+          </>
+        )}
+      </div>
+    </div>
+  );
+}
@@ -1,6 +1,6 @@
 [package]
 name = "huskies"
-version = "0.10.2"
+version = "0.10.4"
 edition = "2024"
 build = "build.rs"
 
@@ -59,12 +59,17 @@ fn wizard_generate_reply(ctx: &CommandContext) -> String {
 }
 
 /// Compose a status reply for the `setup` command (no args).
+///
+/// If no wizard state exists, automatically initializes it so the user does
+/// not need to run `huskies init` manually.
 fn wizard_status_reply(ctx: &CommandContext) -> String {
+    if WizardState::load(ctx.project_root).is_none() {
+        WizardState::init_if_missing(ctx.project_root);
+    }
     match WizardState::load(ctx.project_root) {
         Some(state) => format_wizard_state(&state),
-        None => {
-            "No setup wizard active. Run `huskies init` in the project root to begin.".to_string()
-        }
+        None => "Unable to initialize setup wizard. Ensure the `.huskies/` directory exists."
+            .to_string(),
     }
 }
 
@@ -205,13 +210,18 @@ mod tests {
     }
 
     #[test]
-    fn setup_no_wizard_returns_helpful_message() {
+    fn setup_no_wizard_auto_initializes() {
         let dir = TempDir::new().unwrap();
+        std::fs::create_dir_all(dir.path().join(".huskies")).unwrap();
         let agents = Arc::new(crate::agents::AgentPool::new_test(4000));
         let rooms = Arc::new(Mutex::new(HashSet::new()));
         let ctx = make_ctx("", dir.path(), &agents, &rooms);
         let result = handle_setup(&ctx).unwrap();
-        assert!(result.contains("huskies init"));
+        // Bot should auto-initialize and return wizard status, not ask user to run huskies init.
+        assert!(result.contains("Setup wizard"));
+        assert!(!result.contains("huskies init"));
+        // Wizard state file should now exist.
+        assert!(WizardState::load(dir.path()).is_some());
     }
 
     #[test]
@@ -2,6 +2,65 @@
 
 use super::CommandContext;
 
+/// Strip YAML front matter and return a summary of useful fields + the remaining body.
+fn strip_front_matter(text: &str) -> (String, String) {
+    let trimmed = text.trim_start();
+    if !trimmed.starts_with("---") {
+        return (String::new(), text.to_string());
+    }
+
+    // Find the closing ---
+    if let Some(end) = trimmed[3..].find("\n---") {
+        let yaml_block = &trimmed[3..3 + end].trim();
+        let body = &trimmed[3 + end + 4..]; // skip past closing ---
+
+        // Extract useful fields from YAML (simple line-based parsing)
+        let mut parts = Vec::new();
+        for line in yaml_block.lines() {
+            let line = line.trim();
+            if line.starts_with("depends_on:") {
+                let val = line.trim_start_matches("depends_on:").trim();
+                if !val.is_empty() && val != "[]" {
+                    parts.push(format!("**Depends on:** {val}"));
+                }
+            } else if line.starts_with("agent:") {
+                let val = line.trim_start_matches("agent:").trim().trim_matches('"');
+                if !val.is_empty() {
+                    parts.push(format!("**Agent:** {val}"));
+                }
+            } else if line.starts_with("blocked:") {
+                let val = line.trim_start_matches("blocked:").trim();
+                if val == "true" {
+                    parts.push("**Blocked:** yes".to_string());
+                }
+            } else if line.starts_with("retry_count:") {
+                let val = line.trim_start_matches("retry_count:").trim();
+                if val != "0" && !val.is_empty() {
+                    parts.push(format!("**Retries:** {val}"));
+                }
+            } else if line.starts_with("qa:") {
+                let val = line.trim_start_matches("qa:").trim().trim_matches('"');
+                if val == "human" {
+                    parts.push("**QA:** human review required".to_string());
+                }
+            } else if line.starts_with("merge_failure:") {
+                let val = line
+                    .trim_start_matches("merge_failure:")
+                    .trim()
+                    .trim_matches('"');
+                if !val.is_empty() {
+                    parts.push(format!("**Merge failure:** {val}"));
+                }
+            }
+        }
+
+        (parts.join(" · "), body.to_string())
+    } else {
+        // No closing ---, return as-is
+        (String::new(), text.to_string())
+    }
+}
+
 /// Display the full markdown text of a work item identified by its numeric ID.
 ///
 /// Lookup priority: CRDT → content store → filesystem (Story 512).
@@ -34,9 +93,38 @@ pub(super) fn handle_show(ctx: &CommandContext) -> Option<String> {
 
     // `content` comes from the CRDT / content store. If unavailable, report
     // it rather than silently reading a stale on-disk copy.
-    Some(content.unwrap_or_else(|| {
+    let text = content.unwrap_or_else(|| {
         format!("Story {story_id} found in pipeline but its content is unavailable.")
-    }))
+    });
+
+    // Strip front matter block and extract useful metadata to show inline.
+    let (front_matter_summary, body) = strip_front_matter(&text);
+
+    // Convert markdown headings to bold text for consistent rendering across
+    // Matrix clients. Element X doesn't style <h2> tags distinctly, but bold
+    // text renders consistently everywhere.
+    let formatted = body
+        .lines()
+        .map(|line| {
+            let trimmed = line.trim_start();
+            if let Some(rest) = trimmed.strip_prefix("### ") {
+                format!("\n**{}**", rest)
+            } else if let Some(rest) = trimmed.strip_prefix("## ") {
+                format!("\n**{}**", rest)
+            } else if let Some(rest) = trimmed.strip_prefix("# ") {
+                format!("\n**{}**", rest)
+            } else {
+                line.to_string()
+            }
+        })
+        .collect::<Vec<_>>()
+        .join("\n");
+
+    if front_matter_summary.is_empty() {
+        Some(formatted.trim().to_string())
+    } else {
+        Some(format!("{front_matter_summary}\n{}", formatted.trim()))
+    }
 }
 
 #[cfg(test)]
@@ -4,7 +4,7 @@ use crate::chat::ChatTransport;
 use crate::chat::timer::TimerStore;
 use crate::http::context::{PermissionDecision, PermissionForward};
 use matrix_sdk::ruma::{OwnedEventId, OwnedRoomId, OwnedUserId};
-use std::collections::{HashMap, HashSet};
+use std::collections::{BTreeMap, HashMap, HashSet};
 use std::path::PathBuf;
 use std::sync::Arc;
 use tokio::sync::Mutex as TokioMutex;
@@ -65,6 +65,70 @@ pub struct BotContext {
     /// In gateway mode: valid project names accepted by the `switch` command.
     /// Empty in standalone mode.
     pub gateway_projects: Vec<String>,
+    /// In gateway mode: mapping of project name → base URL (e.g. `"http://localhost:3001"`).
+    /// Used to proxy bot commands to the active project's `/api/bot/command` endpoint.
+    /// Empty in standalone mode.
+    pub gateway_project_urls: BTreeMap<String, String>,
+}
+
+impl BotContext {
+    /// Resolve the effective project root for command dispatch.
+    ///
+    /// In gateway mode the bot's `project_root` is the gateway config directory.
+    /// Each project lives in a subdirectory named after the project, so the
+    /// effective root for commands is `project_root / active_project_name`.
+    /// In standalone (single-project) mode this returns `project_root` unchanged.
+    pub async fn effective_project_root(&self) -> PathBuf {
+        if let Some(ref ap) = self.gateway_active_project {
+            let name = ap.read().await.clone();
+            self.project_root.join(&name)
+        } else {
+            self.project_root.clone()
+        }
+    }
+
+    /// Returns `true` if the bot is running in gateway mode.
+    pub fn is_gateway(&self) -> bool {
+        self.gateway_active_project.is_some()
+    }
+
+    /// Return the base URL for the currently active project, if in gateway mode.
+    pub async fn active_project_url(&self) -> Option<String> {
+        let ap = self.gateway_active_project.as_ref()?;
+        let name = ap.read().await.clone();
|
self.gateway_project_urls.get(&name).cloned()
|
||||||
|
}
|
||||||
|
|
||||||
|
/// Proxy a bot command to the active project's `/api/bot/command` endpoint.
|
||||||
|
///
|
||||||
|
/// Returns the Markdown response from the project server, or an error
|
||||||
|
/// message if the request failed.
|
||||||
|
pub async fn proxy_bot_command(&self, command: &str, args: &str) -> Option<String> {
|
||||||
|
let base_url = self.active_project_url().await?;
|
||||||
|
let url = format!("{base_url}/api/bot/command");
|
||||||
|
let client = reqwest::Client::new();
|
||||||
|
let body = serde_json::json!({
|
||||||
|
"command": command,
|
||||||
|
"args": args,
|
||||||
|
});
|
||||||
|
match client.post(&url).json(&body).send().await {
|
||||||
|
Ok(resp) if resp.status().is_success() => {
|
||||||
|
match resp.json::<serde_json::Value>().await {
|
||||||
|
Ok(json) => json
|
||||||
|
.get("response")
|
||||||
|
.and_then(|v| v.as_str())
|
||||||
|
.map(String::from),
|
||||||
|
Err(e) => Some(format!("Failed to parse response from project server: {e}")),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
Ok(resp) => Some(format!(
|
||||||
|
"Project server returned HTTP {}: {}",
|
||||||
|
resp.status(),
|
||||||
|
resp.text().await.unwrap_or_default()
|
||||||
|
)),
|
||||||
|
Err(e) => Some(format!("Failed to reach project server at {url}: {e}")),
|
||||||
|
}
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
// ---------------------------------------------------------------------------
|
// ---------------------------------------------------------------------------
|
||||||
@@ -88,6 +152,135 @@ mod tests {
|
|||||||
assert_clone::<BotContext>();
|
assert_clone::<BotContext>();
|
||||||
}
|
}
|
||||||
|
|
||||||
|
#[tokio::test]
|
||||||
|
async fn effective_project_root_standalone_returns_project_root() {
|
||||||
|
// In standalone mode (gateway_active_project is None), the effective root
|
||||||
|
// must equal the project_root exactly.
|
||||||
|
let (_perm_tx, perm_rx) = mpsc::unbounded_channel();
|
||||||
|
let ctx = BotContext {
|
||||||
|
bot_user_id: make_user_id("@bot:example.com"),
|
||||||
|
target_room_ids: vec![],
|
||||||
|
project_root: PathBuf::from("/projects/myapp"),
|
||||||
|
allowed_users: vec![],
|
||||||
|
history: Arc::new(TokioMutex::new(std::collections::HashMap::new())),
|
||||||
|
history_size: 20,
|
||||||
|
bot_sent_event_ids: Arc::new(TokioMutex::new(std::collections::HashSet::new())),
|
||||||
|
perm_rx: Arc::new(TokioMutex::new(perm_rx)),
|
||||||
|
pending_perm_replies: Arc::new(TokioMutex::new(std::collections::HashMap::new())),
|
||||||
|
permission_timeout_secs: 120,
|
||||||
|
bot_name: "Assistant".to_string(),
|
||||||
|
ambient_rooms: Arc::new(std::sync::Mutex::new(std::collections::HashSet::new())),
|
||||||
|
agents: Arc::new(crate::agents::AgentPool::new_test(3000)),
|
||||||
|
htop_sessions: Arc::new(TokioMutex::new(std::collections::HashMap::new())),
|
||||||
|
transport: Arc::new(crate::chat::transport::whatsapp::WhatsAppTransport::new(
|
||||||
|
"test-phone".to_string(),
|
||||||
|
"test-token".to_string(),
|
||||||
|
"pipeline_notification".to_string(),
|
||||||
|
)),
|
||||||
|
timer_store: Arc::new(crate::chat::timer::TimerStore::load(
|
||||||
|
std::path::PathBuf::from("/tmp/timers.json"),
|
||||||
|
)),
|
||||||
|
gateway_active_project: None,
|
||||||
|
gateway_projects: vec![],
|
||||||
|
gateway_project_urls: BTreeMap::new(),
|
||||||
|
};
|
||||||
|
assert_eq!(
|
||||||
|
ctx.effective_project_root().await,
|
||||||
|
PathBuf::from("/projects/myapp")
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
#[tokio::test]
|
||||||
|
async fn effective_project_root_gateway_uses_active_project_subdir() {
|
||||||
|
// In gateway mode, the effective root must be config_dir / active_project_name.
|
||||||
|
let (_perm_tx, perm_rx) = mpsc::unbounded_channel();
|
||||||
|
let active = Arc::new(RwLock::new("huskies".to_string()));
|
||||||
|
let ctx = BotContext {
|
||||||
|
bot_user_id: make_user_id("@bot:example.com"),
|
||||||
|
target_room_ids: vec![],
|
||||||
|
project_root: PathBuf::from("/gateway"),
|
||||||
|
allowed_users: vec![],
|
||||||
|
history: Arc::new(TokioMutex::new(std::collections::HashMap::new())),
|
||||||
|
history_size: 20,
|
||||||
|
bot_sent_event_ids: Arc::new(TokioMutex::new(std::collections::HashSet::new())),
|
||||||
|
perm_rx: Arc::new(TokioMutex::new(perm_rx)),
|
||||||
|
pending_perm_replies: Arc::new(TokioMutex::new(std::collections::HashMap::new())),
|
||||||
|
permission_timeout_secs: 120,
|
||||||
|
bot_name: "Assistant".to_string(),
|
||||||
|
ambient_rooms: Arc::new(std::sync::Mutex::new(std::collections::HashSet::new())),
|
||||||
|
agents: Arc::new(crate::agents::AgentPool::new_test(3000)),
|
||||||
|
htop_sessions: Arc::new(TokioMutex::new(std::collections::HashMap::new())),
|
||||||
|
transport: Arc::new(crate::chat::transport::whatsapp::WhatsAppTransport::new(
|
||||||
|
"test-phone".to_string(),
|
||||||
|
"test-token".to_string(),
|
||||||
|
"pipeline_notification".to_string(),
|
||||||
|
)),
|
||||||
|
timer_store: Arc::new(crate::chat::timer::TimerStore::load(
|
||||||
|
std::path::PathBuf::from("/tmp/timers.json"),
|
||||||
|
)),
|
||||||
|
gateway_active_project: Some(Arc::clone(&active)),
|
||||||
|
gateway_projects: vec!["huskies".into(), "robot-studio".into()],
|
||||||
|
gateway_project_urls: BTreeMap::from([
|
||||||
|
("huskies".into(), "http://localhost:3001".into()),
|
||||||
|
("robot-studio".into(), "http://localhost:3002".into()),
|
||||||
|
]),
|
||||||
|
};
|
||||||
|
assert_eq!(
|
||||||
|
ctx.effective_project_root().await,
|
||||||
|
PathBuf::from("/gateway/huskies")
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
#[tokio::test]
|
||||||
|
async fn effective_project_root_gateway_reflects_project_switch() {
|
||||||
|
// Switching the active project must change the effective root.
|
||||||
|
let (_perm_tx, perm_rx) = mpsc::unbounded_channel();
|
||||||
|
let active = Arc::new(RwLock::new("huskies".to_string()));
|
||||||
|
let ctx = BotContext {
|
||||||
|
bot_user_id: make_user_id("@bot:example.com"),
|
||||||
|
target_room_ids: vec![],
|
||||||
|
project_root: PathBuf::from("/gateway"),
|
||||||
|
allowed_users: vec![],
|
||||||
|
history: Arc::new(TokioMutex::new(std::collections::HashMap::new())),
|
||||||
|
history_size: 20,
|
||||||
|
bot_sent_event_ids: Arc::new(TokioMutex::new(std::collections::HashSet::new())),
|
||||||
|
perm_rx: Arc::new(TokioMutex::new(perm_rx)),
|
||||||
|
pending_perm_replies: Arc::new(TokioMutex::new(std::collections::HashMap::new())),
|
||||||
|
permission_timeout_secs: 120,
|
||||||
|
bot_name: "Assistant".to_string(),
|
||||||
|
ambient_rooms: Arc::new(std::sync::Mutex::new(std::collections::HashSet::new())),
|
||||||
|
agents: Arc::new(crate::agents::AgentPool::new_test(3000)),
|
||||||
|
htop_sessions: Arc::new(TokioMutex::new(std::collections::HashMap::new())),
|
||||||
|
transport: Arc::new(crate::chat::transport::whatsapp::WhatsAppTransport::new(
|
||||||
|
"test-phone".to_string(),
|
||||||
|
"test-token".to_string(),
|
||||||
|
"pipeline_notification".to_string(),
|
||||||
|
)),
|
||||||
|
timer_store: Arc::new(crate::chat::timer::TimerStore::load(
|
||||||
|
std::path::PathBuf::from("/tmp/timers.json"),
|
||||||
|
)),
|
||||||
|
gateway_active_project: Some(Arc::clone(&active)),
|
||||||
|
gateway_projects: vec!["huskies".into(), "robot-studio".into()],
|
||||||
|
gateway_project_urls: BTreeMap::from([
|
||||||
|
("huskies".into(), "http://localhost:3001".into()),
|
||||||
|
("robot-studio".into(), "http://localhost:3002".into()),
|
||||||
|
]),
|
||||||
|
};
|
||||||
|
|
||||||
|
assert_eq!(
|
||||||
|
ctx.effective_project_root().await,
|
||||||
|
PathBuf::from("/gateway/huskies")
|
||||||
|
);
|
||||||
|
|
||||||
|
// Simulate switch_project changing the active project.
|
||||||
|
*active.write().await = "robot-studio".to_string();
|
||||||
|
|
||||||
|
assert_eq!(
|
||||||
|
ctx.effective_project_root().await,
|
||||||
|
PathBuf::from("/gateway/robot-studio")
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
#[test]
|
#[test]
|
||||||
fn bot_context_has_no_require_verified_devices_field() {
|
fn bot_context_has_no_require_verified_devices_field() {
|
||||||
// Verification is always on — BotContext no longer has a toggle field.
|
// Verification is always on — BotContext no longer has a toggle field.
|
||||||
@@ -118,6 +311,7 @@ mod tests {
|
|||||||
)),
|
)),
|
||||||
gateway_active_project: None,
|
gateway_active_project: None,
|
||||||
gateway_projects: vec![],
|
gateway_projects: vec![],
|
||||||
|
gateway_project_urls: BTreeMap::new(),
|
||||||
};
|
};
|
||||||
// Clone must work (required by Matrix SDK event handler injection).
|
// Clone must work (required by Matrix SDK event handler injection).
|
||||||
let _cloned = ctx.clone();
|
let _cloned = ctx.clone();
|
||||||
|
|||||||
@@ -96,6 +96,49 @@ mod tests {
|
|||||||
);
|
);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn markdown_to_html_heading_renders_as_h_tag() {
|
||||||
|
let html = markdown_to_html("## Section\nContent here.");
|
||||||
|
assert!(
|
||||||
|
html.contains("<h2>Section</h2>"),
|
||||||
|
"expected <h2> heading tag: {html}"
|
||||||
|
);
|
||||||
|
assert!(
|
||||||
|
html.contains("<p>Content here.</p>"),
|
||||||
|
"expected paragraph after heading: {html}"
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn markdown_to_html_heading_with_preceding_prose_renders_correctly() {
|
||||||
|
let html = markdown_to_html("Intro text.\n## Section\nBody.");
|
||||||
|
assert!(
|
||||||
|
html.contains("<h2>Section</h2>"),
|
||||||
|
"expected <h2> heading tag: {html}"
|
||||||
|
);
|
||||||
|
assert!(
|
||||||
|
html.contains("<p>Intro text.</p>"),
|
||||||
|
"expected intro paragraph: {html}"
|
||||||
|
);
|
||||||
|
assert!(
|
||||||
|
html.contains("<p>Body.</p>"),
|
||||||
|
"expected body paragraph: {html}"
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn markdown_to_html_multiple_headings_each_render_as_h_tags() {
|
||||||
|
let html = markdown_to_html("## Section 1\nContent one.\n\n## Section 2\nContent two.");
|
||||||
|
assert!(
|
||||||
|
html.contains("<h2>Section 1</h2>"),
|
||||||
|
"expected first <h2>: {html}"
|
||||||
|
);
|
||||||
|
assert!(
|
||||||
|
html.contains("<h2>Section 2</h2>"),
|
||||||
|
"expected second <h2>: {html}"
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
#[test]
|
#[test]
|
||||||
fn startup_announcement_uses_bot_name() {
|
fn startup_announcement_uses_bot_name() {
|
||||||
assert_eq!(format_startup_announcement("Timmy"), "Timmy is online.");
|
assert_eq!(format_startup_announcement("Timmy"), "Timmy is online.");
|
||||||
|
|||||||
@@ -174,13 +174,71 @@ pub(super) async fn on_room_message(
|
|||||||
let user_message = body;
|
let user_message = body;
|
||||||
slog!("[matrix-bot] Message from {sender}: {user_message}");
|
slog!("[matrix-bot] Message from {sender}: {user_message}");
|
||||||
|
|
||||||
|
// In gateway mode, resolve commands against the active project's root directory.
|
||||||
|
// The gateway's own project_root is the gateway config dir; each project lives in
|
||||||
|
// a subdirectory named after the project. Standalone mode is unaffected.
|
||||||
|
let effective_root = ctx.effective_project_root().await;
|
||||||
|
|
||||||
|
// ── Gateway command proxy ───────────────────────────────────────────
|
||||||
|
// In gateway mode the bot has no local CRDT or project filesystem, so most
|
||||||
|
// commands must be forwarded to the active project's `/api/bot/command`
|
||||||
|
// endpoint. Only a small set of gateway-local commands are handled here.
|
||||||
|
if ctx.is_gateway() {
|
||||||
|
// Commands that are meaningful on the gateway itself (no project state needed).
|
||||||
|
const GATEWAY_LOCAL_COMMANDS: &[&str] = &["help", "ambient", "reset", "switch"];
|
||||||
|
|
||||||
|
let stripped = crate::chat::util::strip_bot_mention(
|
||||||
|
&user_message,
|
||||||
|
&ctx.bot_name,
|
||||||
|
ctx.bot_user_id.as_str(),
|
||||||
|
)
|
||||||
|
.trim()
|
||||||
|
.trim_start_matches(|c: char| !c.is_alphanumeric())
|
||||||
|
.to_string();
|
||||||
|
|
||||||
|
let (cmd, args) = match stripped.split_once(char::is_whitespace) {
|
||||||
|
Some((c, a)) => (c.to_ascii_lowercase(), a.trim().to_string()),
|
||||||
|
None => (stripped.to_ascii_lowercase(), String::new()),
|
||||||
|
};
|
||||||
|
|
||||||
|
// Only proxy if the first word is a known bot command (sync or async).
|
||||||
|
let is_known_command = !cmd.is_empty()
|
||||||
|
&& !GATEWAY_LOCAL_COMMANDS.contains(&cmd.as_str())
|
||||||
|
&& (crate::chat::commands::commands()
|
||||||
|
.iter()
|
||||||
|
.any(|c| c.name == cmd)
|
||||||
|
|| [
|
||||||
|
"assign", "start", "delete", "rebuild", "rmtree", "htop", "timer",
|
||||||
|
]
|
||||||
|
.contains(&cmd.as_str()));
|
||||||
|
|
||||||
|
if is_known_command {
|
||||||
|
// Proxy to the active project server.
|
||||||
|
let response = match ctx.proxy_bot_command(&cmd, &args).await {
|
||||||
|
Some(r) => r,
|
||||||
|
None => "No active project selected or project URL not configured.".to_string(),
|
||||||
|
};
|
||||||
|
let html = markdown_to_html(&response);
|
||||||
|
if let Ok(msg_id) = ctx
|
||||||
|
.transport
|
||||||
|
.send_message(&room_id_str, &response, &html)
|
||||||
|
.await
|
||||||
|
&& let Ok(event_id) = msg_id.parse()
|
||||||
|
{
|
||||||
|
ctx.bot_sent_event_ids.lock().await.insert(event_id);
|
||||||
|
}
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
// Gateway-local commands and freeform text fall through to normal handling below.
|
||||||
|
}
|
||||||
|
|
||||||
// Check for bot-level commands (help, status, ambient, …) before invoking
|
// Check for bot-level commands (help, status, ambient, …) before invoking
|
||||||
// the LLM. All commands are registered in commands.rs — no special-casing
|
// the LLM. All commands are registered in commands.rs — no special-casing
|
||||||
// needed here.
|
// needed here.
|
||||||
let dispatch = super::super::commands::CommandDispatch {
|
let dispatch = super::super::commands::CommandDispatch {
|
||||||
bot_name: &ctx.bot_name,
|
bot_name: &ctx.bot_name,
|
||||||
bot_user_id: ctx.bot_user_id.as_str(),
|
bot_user_id: ctx.bot_user_id.as_str(),
|
||||||
project_root: &ctx.project_root,
|
project_root: &effective_root,
|
||||||
agents: &ctx.agents,
|
agents: &ctx.agents,
|
||||||
ambient_rooms: &ctx.ambient_rooms,
|
ambient_rooms: &ctx.ambient_rooms,
|
||||||
room_id: &room_id_str,
|
room_id: &room_id_str,
|
||||||
@@ -219,7 +277,7 @@ pub(super) async fn on_room_message(
|
|||||||
&ctx.bot_name,
|
&ctx.bot_name,
|
||||||
&story_number,
|
&story_number,
|
||||||
&model,
|
&model,
|
||||||
&ctx.project_root,
|
&effective_root,
|
||||||
&ctx.agents,
|
&ctx.agents,
|
||||||
)
|
)
|
||||||
.await
|
.await
|
||||||
@@ -287,7 +345,7 @@ pub(super) async fn on_room_message(
|
|||||||
super::super::delete::handle_delete(
|
super::super::delete::handle_delete(
|
||||||
&ctx.bot_name,
|
&ctx.bot_name,
|
||||||
&story_number,
|
&story_number,
|
||||||
&ctx.project_root,
|
&effective_root,
|
||||||
&ctx.agents,
|
&ctx.agents,
|
||||||
)
|
)
|
||||||
.await
|
.await
|
||||||
@@ -321,7 +379,7 @@ pub(super) async fn on_room_message(
|
|||||||
super::super::rmtree::handle_rmtree(
|
super::super::rmtree::handle_rmtree(
|
||||||
&ctx.bot_name,
|
&ctx.bot_name,
|
||||||
&story_number,
|
&story_number,
|
||||||
&ctx.project_root,
|
&effective_root,
|
||||||
&ctx.agents,
|
&ctx.agents,
|
||||||
)
|
)
|
||||||
.await
|
.await
|
||||||
@@ -361,7 +419,7 @@ pub(super) async fn on_room_message(
|
|||||||
&ctx.bot_name,
|
&ctx.bot_name,
|
||||||
&story_number,
|
&story_number,
|
||||||
agent_hint.as_deref(),
|
agent_hint.as_deref(),
|
||||||
&ctx.project_root,
|
&effective_root,
|
||||||
&ctx.agents,
|
&ctx.agents,
|
||||||
)
|
)
|
||||||
.await
|
.await
|
||||||
@@ -587,7 +645,18 @@ pub(super) async fn handle_message(
|
|||||||
let sent_any_chunk = Arc::new(AtomicBool::new(false));
|
let sent_any_chunk = Arc::new(AtomicBool::new(false));
|
||||||
let sent_any_chunk_for_callback = Arc::clone(&sent_any_chunk);
|
let sent_any_chunk_for_callback = Arc::clone(&sent_any_chunk);
|
||||||
|
|
||||||
let project_root_str = ctx.project_root.to_string_lossy().to_string();
|
// In gateway mode, run Claude Code in the gateway config directory so it
|
||||||
|
// picks up the `.mcp.json` that points to the gateway's MCP proxy endpoint.
|
||||||
|
// The gateway proxies tool calls to the active project automatically.
|
||||||
|
// In standalone mode, use the project root directly.
|
||||||
|
let project_root_str = if ctx.is_gateway() {
|
||||||
|
ctx.project_root.to_string_lossy().to_string()
|
||||||
|
} else {
|
||||||
|
ctx.effective_project_root()
|
||||||
|
.await
|
||||||
|
.to_string_lossy()
|
||||||
|
.to_string()
|
||||||
|
};
|
||||||
let chat_fut = provider.chat_stream(
|
let chat_fut = provider.chat_stream(
|
||||||
&prompt,
|
&prompt,
|
||||||
&project_root_str,
|
&project_root_str,
|
||||||
|
|||||||
@@ -30,6 +30,7 @@ pub async fn run_bot(
|
|||||||
shutdown_rx: watch::Receiver<Option<crate::rebuild::ShutdownReason>>,
|
shutdown_rx: watch::Receiver<Option<crate::rebuild::ShutdownReason>>,
|
||||||
gateway_active_project: Option<Arc<RwLock<String>>>,
|
gateway_active_project: Option<Arc<RwLock<String>>>,
|
||||||
gateway_projects: Vec<String>,
|
gateway_projects: Vec<String>,
|
||||||
|
gateway_project_urls: std::collections::BTreeMap<String, String>,
|
||||||
) -> Result<(), String> {
|
) -> Result<(), String> {
|
||||||
let store_path = project_root.join(".huskies").join("matrix_store");
|
let store_path = project_root.join(".huskies").join("matrix_store");
|
||||||
let client = Client::builder()
|
let client = Client::builder()
|
||||||
@@ -247,6 +248,7 @@ pub async fn run_bot(
|
|||||||
timer_store,
|
timer_store,
|
||||||
gateway_active_project,
|
gateway_active_project,
|
||||||
gateway_projects,
|
gateway_projects,
|
||||||
|
gateway_project_urls,
|
||||||
};
|
};
|
||||||
|
|
||||||
slog!(
|
slog!(
|
||||||
|
|||||||
@@ -62,6 +62,7 @@ use tokio::sync::{Mutex as TokioMutex, RwLock, broadcast, mpsc, watch};
|
|||||||
/// Returns an [`tokio::task::AbortHandle`] if the bot was actually spawned (Matrix/Discord
|
/// Returns an [`tokio::task::AbortHandle`] if the bot was actually spawned (Matrix/Discord
|
||||||
/// transports), or `None` if the config is absent, disabled, or uses a webhook-based
|
/// transports), or `None` if the config is absent, disabled, or uses a webhook-based
|
||||||
/// transport (Slack/WhatsApp) that does not require a persistent background task.
|
/// transport (Slack/WhatsApp) that does not require a persistent background task.
|
||||||
|
#[allow(clippy::too_many_arguments)]
|
||||||
pub fn spawn_bot(
|
pub fn spawn_bot(
|
||||||
project_root: &Path,
|
project_root: &Path,
|
||||||
watcher_tx: broadcast::Sender<WatcherEvent>,
|
watcher_tx: broadcast::Sender<WatcherEvent>,
|
||||||
@@ -70,6 +71,7 @@ pub fn spawn_bot(
|
|||||||
shutdown_rx: watch::Receiver<Option<ShutdownReason>>,
|
shutdown_rx: watch::Receiver<Option<ShutdownReason>>,
|
||||||
gateway_active_project: Option<Arc<RwLock<String>>>,
|
gateway_active_project: Option<Arc<RwLock<String>>>,
|
||||||
gateway_projects: Vec<String>,
|
gateway_projects: Vec<String>,
|
||||||
|
gateway_project_urls: std::collections::BTreeMap<String, String>,
|
||||||
) -> Option<tokio::task::AbortHandle> {
|
) -> Option<tokio::task::AbortHandle> {
|
||||||
let config = match BotConfig::load(project_root) {
|
let config = match BotConfig::load(project_root) {
|
||||||
Some(c) => c,
|
Some(c) => c,
|
||||||
@@ -108,6 +110,7 @@ pub fn spawn_bot(
|
|||||||
shutdown_rx,
|
shutdown_rx,
|
||||||
gateway_active_project,
|
gateway_active_project,
|
||||||
gateway_projects,
|
gateway_projects,
|
||||||
|
gateway_project_urls,
|
||||||
)
|
)
|
||||||
.await
|
.await
|
||||||
{
|
{
|
||||||
|
|||||||
+50
-6
@@ -223,12 +223,24 @@ pub fn normalize_line_breaks(text: &str) -> String {
|
|||||||
|
|
||||||
let prev_line = lines[i - 1];
|
let prev_line = lines[i - 1];
|
||||||
|
|
||||||
// Insert a blank separator when both the current and previous lines
|
// ATX headings (lines starting with one or more `#` characters) always
|
||||||
// are non-empty prose (not inside a code fence, not structured Markdown).
|
// need a blank line before and after them so that Matrix clients render
|
||||||
|
// the heading with visual separation. Without a blank line, a single
|
||||||
|
// newline between a heading and adjacent text is swallowed by many
|
||||||
|
// Matrix clients (including Element X), joining the heading text and
|
||||||
|
// the following content on the same line without any heading formatting.
|
||||||
|
let is_cur_heading = line.trim_start().starts_with('#');
|
||||||
|
let is_prev_heading = prev_line.trim_start().starts_with('#');
|
||||||
|
|
||||||
|
// Insert a blank separator when:
|
||||||
|
// 1. Both lines are non-empty prose (standard prose-to-prose rule).
|
||||||
|
// 2. The current line is an ATX heading (adds blank line *before* it).
|
||||||
|
// 3. The previous line was an ATX heading (adds blank line *after* it).
|
||||||
let should_double = !line.is_empty()
|
let should_double = !line.is_empty()
|
||||||
&& !prev_line.is_empty()
|
&& !prev_line.is_empty()
|
||||||
&& !is_structured_line(line)
|
&& ((!is_structured_line(line) && !is_structured_line(prev_line))
|
||||||
&& !is_structured_line(prev_line);
|
|| is_cur_heading
|
||||||
|
|| is_prev_heading);
|
||||||
|
|
||||||
if should_double {
|
if should_double {
|
||||||
result.push("");
|
result.push("");
|
||||||
@@ -599,10 +611,42 @@ mod tests {
|
|||||||
}
|
}
|
||||||
|
|
||||||
#[test]
|
#[test]
|
||||||
fn normalize_heading_single_newline_preserved() {
|
fn normalize_heading_followed_by_prose_gets_blank_line() {
|
||||||
|
// A blank line must be inserted after a heading so Matrix clients render
|
||||||
|
// the heading with visual separation from the following paragraph.
|
||||||
let input = "# My Heading\nSome text below.";
|
let input = "# My Heading\nSome text below.";
|
||||||
let output = normalize_line_breaks(input);
|
let output = normalize_line_breaks(input);
|
||||||
assert_eq!(output, "# My Heading\nSome text below.");
|
assert_eq!(output, "# My Heading\n\nSome text below.");
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn normalize_prose_before_heading_gets_blank_line() {
|
||||||
|
// A blank line must be inserted before a heading when prose precedes it.
|
||||||
|
let input = "Some intro text.\n## Section";
|
||||||
|
let output = normalize_line_breaks(input);
|
||||||
|
assert_eq!(output, "Some intro text.\n\n## Section");
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn normalize_heading_surrounded_by_prose_gets_blank_lines_both_sides() {
|
||||||
|
let input = "Intro.\n## Heading\nContent.";
|
||||||
|
let output = normalize_line_breaks(input);
|
||||||
|
assert_eq!(output, "Intro.\n\n## Heading\n\nContent.");
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn normalize_consecutive_headings_separated_by_blank_lines() {
|
||||||
|
let input = "## Section 1\n## Section 2";
|
||||||
|
let output = normalize_line_breaks(input);
|
||||||
|
assert_eq!(output, "## Section 1\n\n## Section 2");
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn normalize_heading_already_separated_by_blank_line_unchanged() {
|
||||||
|
// When there is already a blank line, no extra blank is inserted.
|
||||||
|
let input = "# Heading\n\nContent.";
|
||||||
|
let output = normalize_line_breaks(input);
|
||||||
|
assert_eq!(output, "# Heading\n\nContent.");
|
||||||
}
|
}
|
||||||
|
|
||||||
#[test]
|
#[test]
|
||||||
|
|||||||
+108
-64
@@ -320,7 +320,9 @@ pub async fn gateway_mcp_post_handler(
|
|||||||
.unwrap_or("");
|
.unwrap_or("");
|
||||||
|
|
||||||
if GATEWAY_TOOLS.contains(&tool_name) {
|
if GATEWAY_TOOLS.contains(&tool_name) {
|
||||||
to_json_response(handle_gateway_tool(tool_name, &rpc.params, &state).await)
|
to_json_response(
|
||||||
|
handle_gateway_tool(tool_name, &rpc.params, &state, rpc.id.clone()).await,
|
||||||
|
)
|
||||||
} else {
|
} else {
|
||||||
// Proxy to active project's container.
|
// Proxy to active project's container.
|
||||||
match proxy_mcp_call(&state, &bytes).await {
|
match proxy_mcp_call(&state, &bytes).await {
|
||||||
@@ -482,18 +484,22 @@ async fn handle_gateway_tool(
|
|||||||
tool_name: &str,
|
tool_name: &str,
|
||||||
params: &Value,
|
params: &Value,
|
||||||
state: &GatewayState,
|
state: &GatewayState,
|
||||||
|
id: Option<Value>,
|
||||||
) -> JsonRpcResponse {
|
) -> JsonRpcResponse {
|
||||||
let id = None; // The caller wraps this in a proper response.
|
|
||||||
match tool_name {
|
match tool_name {
|
||||||
"switch_project" => handle_switch_project(params, state).await,
|
"switch_project" => handle_switch_project(params, state, id).await,
|
||||||
"gateway_status" => handle_gateway_status(state).await,
|
"gateway_status" => handle_gateway_status(state, id).await,
|
||||||
"gateway_health" => handle_gateway_health(state).await,
|
"gateway_health" => handle_gateway_health(state, id).await,
|
||||||
_ => JsonRpcResponse::error(id, -32601, format!("Unknown gateway tool: {tool_name}")),
|
_ => JsonRpcResponse::error(id, -32601, format!("Unknown gateway tool: {tool_name}")),
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/// Switch the active project.
|
/// Switch the active project.
|
||||||
async fn handle_switch_project(params: &Value, state: &GatewayState) -> JsonRpcResponse {
|
async fn handle_switch_project(
|
||||||
|
params: &Value,
|
||||||
|
state: &GatewayState,
|
||||||
|
id: Option<Value>,
|
||||||
|
) -> JsonRpcResponse {
|
||||||
let project = params
|
let project = params
|
||||||
.get("arguments")
|
.get("arguments")
|
||||||
.and_then(|a| a.get("project"))
|
.and_then(|a| a.get("project"))
|
||||||
@@ -502,7 +508,7 @@ async fn handle_switch_project(params: &Value, state: &GatewayState) -> JsonRpcR
|
|||||||
.unwrap_or("");
|
.unwrap_or("");
|
||||||
|
|
||||||
if project.is_empty() {
|
if project.is_empty() {
|
||||||
return JsonRpcResponse::error(None, -32602, "missing required parameter: project".into());
|
return JsonRpcResponse::error(id, -32602, "missing required parameter: project".into());
|
||||||
}
|
}
|
||||||
|
|
||||||
let url = {
|
let url = {
|
||||||
@@ -510,7 +516,7 @@ async fn handle_switch_project(params: &Value, state: &GatewayState) -> JsonRpcR
|
|||||||
if !projects.contains_key(project) {
|
if !projects.contains_key(project) {
|
||||||
let available: Vec<&str> = projects.keys().map(|s| s.as_str()).collect();
|
let available: Vec<&str> = projects.keys().map(|s| s.as_str()).collect();
|
||||||
return JsonRpcResponse::error(
|
return JsonRpcResponse::error(
|
||||||
None,
|
id,
|
||||||
-32602,
|
-32602,
|
||||||
format!(
|
format!(
|
||||||
"unknown project '{project}'. Available: {}",
|
"unknown project '{project}'. Available: {}",
|
||||||
@@ -524,7 +530,7 @@ async fn handle_switch_project(params: &Value, state: &GatewayState) -> JsonRpcR
|
|||||||
*state.active_project.write().await = project.to_string();
|
*state.active_project.write().await = project.to_string();
|
||||||
|
|
||||||
JsonRpcResponse::success(
|
JsonRpcResponse::success(
|
||||||
None,
|
id,
|
||||||
json!({
|
json!({
|
||||||
"content": [{
|
"content": [{
|
||||||
"type": "text",
|
"type": "text",
|
||||||
@@ -535,11 +541,11 @@ async fn handle_switch_project(params: &Value, state: &GatewayState) -> JsonRpcR
|
|||||||
}
|
}
|
||||||
|
|
||||||
/// Show pipeline status for the active project by proxying `get_pipeline_status`.
|
/// Show pipeline status for the active project by proxying `get_pipeline_status`.
|
||||||
async fn handle_gateway_status(state: &GatewayState) -> JsonRpcResponse {
|
async fn handle_gateway_status(state: &GatewayState, id: Option<Value>) -> JsonRpcResponse {
|
||||||
         let active = state.active_project.read().await.clone();
         let url = match state.active_url().await {
             Ok(u) => u,
-            Err(e) => return JsonRpcResponse::error(None, -32603, e),
+            Err(e) => return JsonRpcResponse::error(id.clone(), -32603, e),
         };

         let mcp_url = format!("{}/mcp", url.trim_end_matches('/'));
@@ -560,7 +566,7 @@ async fn handle_gateway_status(state: &GatewayState) -> JsonRpcResponse {
                     // Extract the result from the upstream response and wrap it.
                     let pipeline = upstream.get("result").cloned().unwrap_or(json!(null));
                     JsonRpcResponse::success(
-                        None,
+                        id,
                         json!({
                             "content": [{
                                 "type": "text",
@@ -573,16 +579,16 @@ async fn handle_gateway_status(state: &GatewayState) -> JsonRpcResponse {
                     )
                 }
                 Err(e) => {
-                    JsonRpcResponse::error(None, -32603, format!("invalid upstream response: {e}"))
+                    JsonRpcResponse::error(id, -32603, format!("invalid upstream response: {e}"))
                 }
             }
         }
-        Err(e) => JsonRpcResponse::error(None, -32603, format!("failed to reach {mcp_url}: {e}")),
+        Err(e) => JsonRpcResponse::error(id, -32603, format!("failed to reach {mcp_url}: {e}")),
     }
 }

 /// Aggregate health checks across all registered projects.
-async fn handle_gateway_health(state: &GatewayState) -> JsonRpcResponse {
+async fn handle_gateway_health(state: &GatewayState, id: Option<Value>) -> JsonRpcResponse {
     let mut results = BTreeMap::new();

     let project_entries: Vec<(String, String)> = state
@@ -609,7 +615,7 @@ async fn handle_gateway_health(state: &GatewayState) -> JsonRpcResponse {

     let active = state.active_project.read().await.clone();
     JsonRpcResponse::success(
-        None,
+        id,
         json!({
             "content": [{
                 "type": "text",
@@ -1104,7 +1110,7 @@ pub async fn gateway_switch_handler(
     body: Json<SwitchRequest>,
 ) -> Response {
     let params = json!({ "arguments": { "project": body.project } });
-    let resp = handle_switch_project(&params, &state).await;
+    let resp = handle_switch_project(&params, &state, None).await;

     let (ok, error) = if resp.result.is_some() {
         (true, None)
@@ -1404,10 +1410,18 @@ pub async fn gateway_bot_config_save_handler(
         h.abort();
     }
     let gateway_projects: Vec<String> = state.projects.read().await.keys().cloned().collect();
+    let gateway_project_urls: std::collections::BTreeMap<String, String> = state
+        .projects
+        .read()
+        .await
+        .iter()
+        .map(|(name, entry)| (name.clone(), entry.url.clone()))
+        .collect();
     let new_handle = spawn_gateway_bot(
         &state.config_dir,
         Arc::clone(&state.active_project),
         gateway_projects,
+        gateway_project_urls,
         state.port,
     );
     *handle = new_handle;
@@ -1634,50 +1648,12 @@ pub async fn gateway_bot_config_page_handler() -> Response {

 // ── Gateway server startup ───────────────────────────────────────────

-/// Start the gateway HTTP server. This is the entry point when `--gateway` is used.
-pub async fn run(config_path: &Path, port: u16) -> Result<(), std::io::Error> {
-    // Locate the gateway config directory (parent of `projects.toml`).
-    let config_dir = config_path
-        .parent()
-        .unwrap_or(std::path::Path::new("."))
-        .to_path_buf();
-
-    let config = GatewayConfig::load(config_path).map_err(std::io::Error::other)?;
-    let state =
-        GatewayState::new(config, config_dir.clone(), port).map_err(std::io::Error::other)?;
-    let state_arc = Arc::new(state);
-
-    let active = state_arc.active_project.read().await.clone();
-    crate::slog!("[gateway] Starting gateway on port {port}, active project: {active}");
-    crate::slog!(
-        "[gateway] Registered projects: {}",
-        state_arc
-            .projects
-            .read()
-            .await
-            .keys()
-            .cloned()
-            .collect::<Vec<_>>()
-            .join(", ")
-    );
-
-    // Write `.mcp.json` so that the gateway's Matrix bot's Claude Code CLI
-    // connects to this gateway's MCP endpoint (which proxies to the active project).
-    if let Err(e) = write_gateway_mcp_json(&config_dir, port) {
-        crate::slog!("[gateway] Warning: could not write .mcp.json: {e}");
-    }
-
-    // Spawn the Matrix bot if `.huskies/bot.toml` exists in the config directory.
-    let gateway_projects: Vec<String> = state_arc.projects.read().await.keys().cloned().collect();
-    let bot_abort = spawn_gateway_bot(
-        &config_dir,
-        Arc::clone(&state_arc.active_project),
-        gateway_projects,
-        port,
-    );
-    *state_arc.bot_handle.lock().await = bot_abort;
-
-    let route = poem::Route::new()
+/// Build the complete gateway route tree.
+///
+/// Extracted from `run` so that tests can construct the full route tree and
+/// catch duplicate-route panics before they reach production.
+pub fn build_gateway_route(state_arc: Arc<GatewayState>) -> impl poem::Endpoint {
+    poem::Route::new()
         .at("/bot-config", poem::get(gateway_bot_config_page_handler))
         .at("/api/gateway", poem::get(gateway_api_handler))
         .at("/api/gateway/switch", poem::post(gateway_switch_handler))
@@ -1732,7 +1708,61 @@ pub async fn run(config_path: &Path, port: u16) -> Result<(), std::io::Error> {
         )
         .at("/*path", poem::get(crate::http::assets::embedded_file))
         .at("/", poem::get(crate::http::assets::embedded_index))
-        .data(state_arc);
+        .data(state_arc)
+}
+
+/// Start the gateway HTTP server. This is the entry point when `--gateway` is used.
+pub async fn run(config_path: &Path, port: u16) -> Result<(), std::io::Error> {
+    // Locate the gateway config directory (parent of `projects.toml`).
+    let config_dir = config_path
+        .parent()
+        .unwrap_or(std::path::Path::new("."))
+        .to_path_buf();
+
+    let config = GatewayConfig::load(config_path).map_err(std::io::Error::other)?;
+    let state =
+        GatewayState::new(config, config_dir.clone(), port).map_err(std::io::Error::other)?;
+    let state_arc = Arc::new(state);
+
+    let active = state_arc.active_project.read().await.clone();
+    crate::slog!("[gateway] Starting gateway on port {port}, active project: {active}");
+    crate::slog!(
+        "[gateway] Registered projects: {}",
+        state_arc
+            .projects
+            .read()
+            .await
+            .keys()
+            .cloned()
+            .collect::<Vec<_>>()
+            .join(", ")
+    );
+
+    // Write `.mcp.json` so that the gateway's Matrix bot's Claude Code CLI
+    // connects to this gateway's MCP endpoint (which proxies to the active project).
+    if let Err(e) = write_gateway_mcp_json(&config_dir, port) {
+        crate::slog!("[gateway] Warning: could not write .mcp.json: {e}");
+    }
+
+    // Spawn the Matrix bot if `.huskies/bot.toml` exists in the config directory.
+    let gateway_projects: Vec<String> = state_arc.projects.read().await.keys().cloned().collect();
+    let gateway_project_urls: std::collections::BTreeMap<String, String> = state_arc
+        .projects
+        .read()
+        .await
+        .iter()
+        .map(|(name, entry)| (name.clone(), entry.url.clone()))
+        .collect();
+    let bot_abort = spawn_gateway_bot(
+        &config_dir,
+        Arc::clone(&state_arc.active_project),
+        gateway_projects,
+        gateway_project_urls,
+        port,
+    );
+    *state_arc.bot_handle.lock().await = bot_abort;
+
+    let route = build_gateway_route(state_arc);

     let host = std::env::var("HUSKIES_HOST").unwrap_or_else(|_| "127.0.0.1".to_string());
     let addr = format!("{host}:{port}");
@@ -1777,6 +1807,7 @@ fn spawn_gateway_bot(
     config_dir: &Path,
     active_project: ActiveProject,
     gateway_projects: Vec<String>,
+    gateway_project_urls: std::collections::BTreeMap<String, String>,
     port: u16,
 ) -> Option<tokio::task::AbortHandle> {
     use crate::agents::AgentPool;
@@ -1808,6 +1839,7 @@ fn spawn_gateway_bot(
         shutdown_rx,
         Some(active_project),
         gateway_projects,
+        gateway_project_urls,
     )
 }

@@ -1899,7 +1931,7 @@ url = "http://localhost:3002"
         let state = GatewayState::new(config, PathBuf::from("."), 3000).unwrap();

         let params = json!({ "arguments": { "project": "beta" } });
-        let resp = handle_switch_project(&params, &state).await;
+        let resp = handle_switch_project(&params, &state, None).await;
         assert!(resp.result.is_some());

         let active = state.active_project.read().await.clone();
@@ -1919,7 +1951,7 @@ url = "http://localhost:3002"
         let state = GatewayState::new(config, PathBuf::from("."), 3000).unwrap();

         let params = json!({ "arguments": { "project": "nonexistent" } });
-        let resp = handle_switch_project(&params, &state).await;
+        let resp = handle_switch_project(&params, &state, None).await;
         assert!(resp.error.is_some());
     }

@@ -2260,4 +2292,16 @@ enabled = false
             .await;
         assert_eq!(resp.0.status(), StatusCode::NOT_FOUND);
     }
+
+    /// Build the full gateway route tree and verify it does not panic.
+    ///
+    /// Poem panics at construction time when duplicate routes are registered.
+    /// This test catches any regression where a duplicate route is re-introduced
+    /// (e.g. the `/` vs `/*path` duplicate fixed in commit 0969fb5d).
+    #[test]
+    fn gateway_route_tree_builds_without_panic() {
+        let state = make_test_state();
+        // build_gateway_route will panic if any route is registered more than once.
+        let _route = build_gateway_route(state);
+    }
 }
@@ -402,6 +402,34 @@ impl AgentsApi {
             }
         }

+        // Filesystem miss — fall back to CRDT-only path (story exists in the CRDT
+        // but has no corresponding .md file on disk).
+        if let Some(content) = crate::db::read_content(&story_id.0) {
+            let item = crate::pipeline_state::read_typed(&story_id.0)
+                .map_err(|e| bad_request(format!("Pipeline read error: {e}")))?;
+            let stage = item
+                .as_ref()
+                .map(|i| match &i.stage {
+                    crate::pipeline_state::Stage::Backlog => "backlog",
+                    crate::pipeline_state::Stage::Coding => "current",
+                    crate::pipeline_state::Stage::Qa => "qa",
+                    crate::pipeline_state::Stage::Merge { .. } => "merge",
+                    crate::pipeline_state::Stage::Done { .. } => "done",
+                    crate::pipeline_state::Stage::Archived { .. } => "archived",
+                })
+                .unwrap_or("unknown")
+                .to_string();
+            let metadata = crate::io::story_metadata::parse_front_matter(&content).ok();
+            let name = metadata.as_ref().and_then(|m| m.name.clone());
+            let agent = metadata.and_then(|m| m.agent);
+            return Ok(Json(WorkItemContentResponse {
+                content,
+                stage,
+                name,
+                agent,
+            }));
+        }
+
         Err(not_found(format!("Work item not found: {}", story_id.0)))
     }

@@ -953,6 +981,50 @@ allowed_tools = ["Read", "Bash"]
         assert!(result.is_err());
     }

+    #[tokio::test]
+    async fn get_work_item_content_falls_back_to_crdt_when_no_file() {
+        let tmp = TempDir::new().unwrap();
+        let root = tmp.path().to_path_buf();
+        // Seed content + CRDT with no .md file on disk.
+        crate::db::write_item_with_content(
+            "44_story_crdt_only",
+            "1_backlog",
+            "---\nname: \"CRDT Only\"\n---\n\nCRDT content.",
+        );
+        let ctx = AppContext::new_test(root);
+        let api = AgentsApi { ctx: Arc::new(ctx) };
+        let result = api
+            .get_work_item_content(Path("44_story_crdt_only".to_string()))
+            .await
+            .unwrap()
+            .0;
+        assert!(result.content.contains("CRDT content."));
+        assert_eq!(result.stage, "backlog");
+        assert_eq!(result.name, Some("CRDT Only".to_string()));
+    }
+
+    #[tokio::test]
+    async fn get_work_item_content_crdt_fallback_with_current_stage() {
+        let tmp = TempDir::new().unwrap();
+        let root = tmp.path().to_path_buf();
+        // Seed a CRDT-only story in the coding/current stage.
+        crate::db::write_item_with_content(
+            "45_story_crdt_current",
+            "2_current",
+            "---\nname: \"Current CRDT\"\n---\n\nIn progress.",
+        );
+        let ctx = AppContext::new_test(root);
+        let api = AgentsApi { ctx: Arc::new(ctx) };
+        let result = api
+            .get_work_item_content(Path("45_story_crdt_current".to_string()))
+            .await
+            .unwrap()
+            .0;
+        assert!(result.content.contains("In progress."));
+        assert_eq!(result.stage, "current");
+        assert_eq!(result.name, Some("Current CRDT".to_string()));
+    }
+
     #[tokio::test]
     async fn get_work_item_content_returns_error_when_no_project_root() {
         let tmp = TempDir::new().unwrap();
@@ -349,13 +349,14 @@ pub(super) fn tool_dump_crdt(args: &Value) -> Result<String, String> {
         .map_err(|e| format!("Serialization error: {e}"))
 }

-/// MCP tool: return the server version and build hash.
-pub(super) fn tool_get_version() -> Result<String, String> {
+/// MCP tool: return the server version, build hash, and running port.
+pub(super) fn tool_get_version(ctx: &AppContext) -> Result<String, String> {
     let build_hash =
         std::fs::read_to_string(".huskies/build_hash").unwrap_or_else(|_| "unknown".to_string());
     serde_json::to_string_pretty(&json!({
         "version": env!("CARGO_PKG_VERSION"),
         "build_hash": build_hash.trim(),
+        "port": ctx.agents.port(),
     }))
     .map_err(|e| format!("Serialization error: {e}"))
 }

@@ -897,7 +897,7 @@ fn handle_tools_list(id: Option<Value>) -> JsonRpcResponse {
             },
             {
                 "name": "get_version",
-                "description": "Return the server version and build hash.",
+                "description": "Return the server version, build hash, and running port.",
                 "inputSchema": {
                     "type": "object",
                     "properties": {}
@@ -1330,7 +1330,7 @@ async fn handle_tools_call(id: Option<Value>, params: &Value, ctx: &AppContext)
         "get_pipeline_status" => story_tools::tool_get_pipeline_status(ctx),
         // Diagnostics
         "get_server_logs" => diagnostics::tool_get_server_logs(&args),
-        "get_version" => diagnostics::tool_get_version(),
+        "get_version" => diagnostics::tool_get_version(ctx),
         // Server lifecycle
         "rebuild_and_restart" => diagnostics::tool_rebuild_and_restart(ctx).await,
         // Permission bridge (Claude Code → frontend dialog)
@@ -43,6 +43,8 @@ pub(crate) fn step_output_path(
                 .join("STACK.md"),
         ),
         WizardStep::TestScript => Some(project_root.join("script").join("test")),
+        WizardStep::BuildScript => Some(project_root.join("script").join("build")),
+        WizardStep::LintScript => Some(project_root.join("script").join("lint")),
         WizardStep::ReleaseScript => Some(project_root.join("script").join("release")),
         WizardStep::TestCoverage => Some(project_root.join("script").join("test_coverage")),
         WizardStep::Scaffold => None,
@@ -52,22 +54,35 @@
 pub(crate) fn is_script_step(step: WizardStep) -> bool {
     matches!(
         step,
-        WizardStep::TestScript | WizardStep::ReleaseScript | WizardStep::TestCoverage
+        WizardStep::TestScript
+            | WizardStep::BuildScript
+            | WizardStep::LintScript
+            | WizardStep::ReleaseScript
+            | WizardStep::TestCoverage
     )
 }

-/// Write `content` to `path` only when the file does not already exist.
+/// Write `content` to `path`, skipping if the file already exists with real
+/// (non-template) content.
 ///
-/// Existing files (including `CLAUDE.md`) are never overwritten — the wizard
-/// appends or skips per the acceptance criteria. For script steps the file is
-/// also made executable after writing.
+/// Scaffold template files (those containing [`TEMPLATE_SENTINEL`]) are treated
+/// as placeholders and will be overwritten with the wizard-generated content.
+/// Files with real user content are never overwritten. For script steps the
+/// file is also made executable after writing.
 pub(crate) fn write_if_missing(
     path: &Path,
     content: &str,
     executable: bool,
 ) -> Result<bool, String> {
+    use crate::io::onboarding::TEMPLATE_SENTINEL;
     if path.exists() {
-        return Ok(false); // already present — skip silently
+        // Overwrite scaffold template placeholders; preserve real user content.
+        let is_template = std::fs::read_to_string(path)
+            .map(|s| s.contains(TEMPLATE_SENTINEL))
+            .unwrap_or(false);
+        if !is_template {
+            return Ok(false); // real content already present — skip
+        }
     }
     if let Some(parent) = path.parent() {
         fs::create_dir_all(parent)
@@ -247,6 +262,90 @@ pub(crate) fn generation_hint(step: WizardStep, project_root: &Path) -> String {
             }
         }
     }
+        WizardStep::BuildScript => {
+            if bare {
+                "This is a bare project with no existing code. Read the STACK.md generated \
+                 in the previous step (or ask the user about their stack if it was skipped) \
+                 and generate a `script/build` shell script (#!/usr/bin/env bash, set -euo pipefail) \
+                 with appropriate build commands for their chosen language and framework."
+                    .to_string()
+            } else {
+                let has_cargo = project_root.join("Cargo.toml").exists();
+                let has_pkg = project_root.join("package.json").exists();
+                let has_pnpm = project_root.join("pnpm-lock.yaml").exists();
+                let has_frontend_subdir =
+                    project_root.join("frontend").join("package.json").exists()
+                        || project_root.join("client").join("package.json").exists();
+                let has_go = project_root.join("go.mod").exists();
+                let mut cmds = Vec::new();
+                if has_cargo {
+                    cmds.push("cargo build --release");
+                }
+                if has_pkg {
+                    cmds.push(if has_pnpm {
+                        "pnpm run build"
+                    } else {
+                        "npm run build"
+                    });
+                }
+                if has_frontend_subdir {
+                    cmds.push("(cd frontend && npm run build)");
+                }
+                if has_go {
+                    cmds.push("go build ./...");
+                }
+                if cmds.is_empty() {
+                    "Generate a `script/build` shell script (#!/usr/bin/env bash, set -euo pipefail) that builds the project.".to_string()
+                } else {
+                    format!(
+                        "Generate a `script/build` shell script (#!/usr/bin/env bash, set -euo pipefail) that runs: {}",
+                        cmds.join(", ")
+                    )
+                }
+            }
+        }
+        WizardStep::LintScript => {
+            if bare {
+                "This is a bare project with no existing code. Read the STACK.md generated \
+                 in the previous step (or ask the user about their stack if it was skipped) \
+                 and generate a `script/lint` shell script (#!/usr/bin/env bash, set -euo pipefail) \
+                 with appropriate lint commands for their chosen language and framework."
+                    .to_string()
+            } else {
+                let has_cargo = project_root.join("Cargo.toml").exists();
+                let has_pkg = project_root.join("package.json").exists();
+                let has_pnpm = project_root.join("pnpm-lock.yaml").exists();
+                let has_python = project_root.join("pyproject.toml").exists()
+                    || project_root.join("requirements.txt").exists();
+                let has_go = project_root.join("go.mod").exists();
+                let mut cmds = Vec::new();
+                if has_cargo {
+                    cmds.push("cargo fmt --all --check");
+                    cmds.push("cargo clippy -- -D warnings");
+                }
+                if has_pkg {
+                    cmds.push(if has_pnpm {
+                        "pnpm run lint"
+                    } else {
+                        "npm run lint"
+                    });
+                }
+                if has_python {
+                    cmds.push("flake8 . (or ruff check . if ruff is configured)");
+                }
+                if has_go {
+                    cmds.push("go vet ./...");
+                }
+                if cmds.is_empty() {
+                    "Generate a `script/lint` shell script (#!/usr/bin/env bash, set -euo pipefail) that runs the project's linters.".to_string()
+                } else {
+                    format!(
+                        "Generate a `script/lint` shell script (#!/usr/bin/env bash, set -euo pipefail) that runs: {}",
+                        cmds.join(", ")
+                    )
+                }
+            }
+        }
         WizardStep::ReleaseScript => {
             if bare {
                 "This is a bare project with no existing code. Read the STACK.md generated \
@@ -473,13 +572,13 @@ mod tests {
     fn wizard_confirm_does_not_overwrite_existing_file() {
         let dir = TempDir::new().unwrap();
         let ctx = setup(&dir);
-        // Pre-create the specs directory and file.
+        // Pre-create the specs directory and file with real (non-template) content.
         let specs_dir = dir.path().join(".huskies").join("specs");
         std::fs::create_dir_all(&specs_dir).unwrap();
         let context_path = specs_dir.join("00_CONTEXT.md");
         std::fs::write(&context_path, "original content").unwrap();

-        // Stage and confirm — existing file should NOT be overwritten.
+        // Stage and confirm — existing real file should NOT be overwritten.
         tool_wizard_generate(&serde_json::json!({"content": "new content"}), &ctx).unwrap();
         let result = tool_wizard_confirm(&ctx).unwrap();
         assert!(result.contains("already exists"));
@@ -489,6 +588,34 @@ mod tests {
         );
     }

+    #[test]
+    fn wizard_confirm_overwrites_scaffold_template_file() {
+        let dir = TempDir::new().unwrap();
+        let ctx = setup(&dir);
+        // Pre-create the file with scaffold template placeholder content.
+        let specs_dir = dir.path().join(".huskies").join("specs");
+        std::fs::create_dir_all(&specs_dir).unwrap();
+        let context_path = specs_dir.join("00_CONTEXT.md");
+        std::fs::write(
+            &context_path,
+            "<!-- huskies:scaffold-template -->\n# Project Context\n\nTODO: Describe...",
+        )
+        .unwrap();
+
+        // Stage and confirm — template placeholder should be overwritten with generated content.
+        tool_wizard_generate(
+            &serde_json::json!({"content": "# My Real Project\n\nThis is a real project."}),
+            &ctx,
+        )
+        .unwrap();
+        let result = tool_wizard_confirm(&ctx).unwrap();
+        assert!(result.contains("confirmed"));
+        assert_eq!(
+            std::fs::read_to_string(&context_path).unwrap(),
+            "# My Real Project\n\nThis is a real project."
+        );
+    }
+
     #[test]
     fn wizard_skip_advances_wizard() {
         let dir = TempDir::new().unwrap();
@@ -517,8 +644,8 @@ mod tests {
     fn wizard_complete_returns_done_message() {
         let dir = TempDir::new().unwrap();
         let ctx = setup(&dir);
-        // Skip all remaining steps.
-        for _ in 0..5 {
+        // Skip all remaining steps (scaffold is pre-confirmed, so 7 remaining).
+        for _ in 0..7 {
             tool_wizard_skip(&ctx).unwrap();
         }
         let result = tool_wizard_status(&ctx).unwrap();
@@ -629,4 +756,61 @@ mod tests {
         assert!(hint.contains("cargo nextest"));
         assert!(!hint.contains("bare project"));
     }
+
+    #[test]
+    fn generation_hint_bare_build_script_references_stack() {
+        let dir = TempDir::new().unwrap();
+        std::fs::create_dir_all(dir.path().join(".huskies")).unwrap();
+        let hint = generation_hint(WizardStep::BuildScript, dir.path());
+        assert!(hint.contains("bare project"));
+        assert!(hint.contains("STACK.md"));
+    }
+
+    #[test]
+    fn generation_hint_bare_lint_script_references_stack() {
+        let dir = TempDir::new().unwrap();
+        std::fs::create_dir_all(dir.path().join(".huskies")).unwrap();
+        let hint = generation_hint(WizardStep::LintScript, dir.path());
+        assert!(hint.contains("bare project"));
+        assert!(hint.contains("STACK.md"));
+    }
+
+    #[test]
+    fn generation_hint_existing_project_build_script_detects_cargo() {
+        let dir = TempDir::new().unwrap();
+        std::fs::write(dir.path().join("Cargo.toml"), "[package]").unwrap();
+        let hint = generation_hint(WizardStep::BuildScript, dir.path());
+        assert!(hint.contains("cargo build --release"));
+        assert!(!hint.contains("bare project"));
+    }
+
+    #[test]
+    fn generation_hint_existing_project_lint_script_detects_cargo() {
+        let dir = TempDir::new().unwrap();
+        std::fs::write(dir.path().join("Cargo.toml"), "[package]").unwrap();
+        let hint = generation_hint(WizardStep::LintScript, dir.path());
+        assert!(hint.contains("cargo fmt --all --check"));
+        assert!(hint.contains("cargo clippy -- -D warnings"));
+        assert!(!hint.contains("bare project"));
+    }
+
+    #[test]
+    fn step_output_path_build_script_returns_script_build() {
+        let dir = TempDir::new().unwrap();
+        let path = step_output_path(dir.path(), WizardStep::BuildScript).unwrap();
+        assert!(path.ends_with("script/build"));
+    }
+
+    #[test]
+    fn step_output_path_lint_script_returns_script_lint() {
+        let dir = TempDir::new().unwrap();
+        let path = step_output_path(dir.path(), WizardStep::LintScript).unwrap();
+        assert!(path.ends_with("script/lint"));
+    }
+
+    #[test]
+    fn is_script_step_includes_build_and_lint() {
+        assert!(is_script_step(WizardStep::BuildScript));
+        assert!(is_script_step(WizardStep::LintScript));
+    }
 }
+411 -1
@@ -1,13 +1,181 @@
 //! HTTP settings endpoints — REST API for user preferences and editor configuration.

+use crate::config::ProjectConfig;
 use crate::http::context::{AppContext, OpenApiResult, bad_request};
 use crate::store::StoreOps;
 use poem_openapi::{Object, OpenApi, Tags, param::Query, payload::Json};
-use serde::Serialize;
+use serde::{Deserialize, Serialize};
 use serde_json::json;
+use std::path::Path;
 use std::sync::Arc;

 const EDITOR_COMMAND_KEY: &str = "editor_command";

+/// Project-level settings exposed via `GET /api/settings` and `PUT /api/settings`.
+///
+/// Only contains the scalar fields of `ProjectConfig` — array sections
+/// (`[[component]]`, `[[agent]]`, `[watcher]`) are preserved in the TOML file
+/// and are not editable through this API.
+#[derive(Debug, Object, Serialize, Deserialize)]
+struct ProjectSettings {
+    /// Project-wide default QA mode: "server", "agent", or "human". Default: "server".
+    default_qa: String,
+    /// Default model for coder-stage agents (e.g. "sonnet"). When set, only agents whose
+    /// model matches this value are used for auto-assignment.
+    default_coder_model: Option<String>,
+    /// Maximum number of concurrent coder-stage agents. When set, stories wait in
+    /// 2_current/ until a slot is free.
+    max_coders: Option<u32>,
+    /// Maximum retries per story per pipeline stage before marking as blocked. Default: 2.
+    max_retries: u32,
+    /// Optional base branch name (e.g. "main", "master"). Overrides auto-detection.
+    base_branch: Option<String>,
+    /// Whether to send RateLimitWarning chat notifications. Default: true.
+    rate_limit_notifications: bool,
+    /// IANA timezone name (e.g. "Europe/London"). Timer inputs are interpreted in this tz.
+    timezone: Option<String>,
+    /// WebSocket URL of a remote huskies node to sync CRDT state with.
+    rendezvous: Option<String>,
+    /// How often (seconds) to check 5_done/ for items to archive. Default: 60.
+    watcher_sweep_interval_secs: u64,
+    /// How long (seconds) an item must remain in 5_done/ before archiving. Default: 14400.
+    watcher_done_retention_secs: u64,
+}
+
+/// Load `ProjectSettings` from `ProjectConfig`.
+fn settings_from_config(cfg: &ProjectConfig) -> ProjectSettings {
+    ProjectSettings {
+        default_qa: cfg.default_qa.clone(),
+        default_coder_model: cfg.default_coder_model.clone(),
+        max_coders: cfg.max_coders.map(|v| v as u32),
+        max_retries: cfg.max_retries,
+        base_branch: cfg.base_branch.clone(),
+        rate_limit_notifications: cfg.rate_limit_notifications,
+        timezone: cfg.timezone.clone(),
+        rendezvous: cfg.rendezvous.clone(),
+        watcher_sweep_interval_secs: cfg.watcher.sweep_interval_secs,
+        watcher_done_retention_secs: cfg.watcher.done_retention_secs,
+    }
+}
+
+/// Validate the incoming `ProjectSettings` before writing.
+fn validate_project_settings(s: &ProjectSettings) -> Result<(), String> {
+    match s.default_qa.as_str() {
+        "server" | "agent" | "human" => {}
+        other => {
+            return Err(format!(
+                "Invalid default_qa value '{other}'. Must be one of: server, agent, human"
+            ));
+        }
+    }
+    Ok(())
+}
+
+/// Write only the scalar settings from `s` into the project.toml at the given root.
+/// Array sections (`[[component]]`, `[[agent]]`) are preserved unchanged.
+fn write_project_settings(project_root: &Path, s: &ProjectSettings) -> Result<(), String> {
+    let config_path = project_root.join(".huskies/project.toml");
+
+    let content = if config_path.exists() {
+        std::fs::read_to_string(&config_path).map_err(|e| format!("Read config: {e}"))?
+    } else {
+        String::new()
+    };
+
+    let mut val: toml::Value = if content.trim().is_empty() {
+        toml::Value::Table(toml::map::Map::new())
+    } else {
+        toml::from_str(&content).map_err(|e| format!("Parse config: {e}"))?
+    };
+
+    let table = val
+        .as_table_mut()
+        .ok_or_else(|| "Config is not a TOML table".to_string())?;
+
+    // Scalar root fields
+    table.insert(
+        "default_qa".to_string(),
+        toml::Value::String(s.default_qa.clone()),
+    );
+    table.insert(
+        "max_retries".to_string(),
+        toml::Value::Integer(s.max_retries as i64),
+    );
+    table.insert(
+        "rate_limit_notifications".to_string(),
+        toml::Value::Boolean(s.rate_limit_notifications),
+    );
+
+    // Optional scalar fields
+    match &s.default_coder_model {
+        Some(v) => {
+            table.insert(
+                "default_coder_model".to_string(),
+                toml::Value::String(v.clone()),
+            );
+        }
+        None => {
+            table.remove("default_coder_model");
+        }
+    }
+    match s.max_coders {
+        Some(v) => {
+            table.insert("max_coders".to_string(), toml::Value::Integer(v as i64));
+        }
+        None => {
+            table.remove("max_coders");
+        }
+    }
+    match &s.base_branch {
+        Some(v) => {
+            table.insert("base_branch".to_string(), toml::Value::String(v.clone()));
+        }
+        None => {
+            table.remove("base_branch");
+        }
+    }
+    match &s.timezone {
+        Some(v) => {
+            table.insert("timezone".to_string(), toml::Value::String(v.clone()));
+        }
+        None => {
+            table.remove("timezone");
+        }
+    }
+    match &s.rendezvous {
+        Some(v) => {
+            table.insert("rendezvous".to_string(), toml::Value::String(v.clone()));
+        }
+        None => {
+            table.remove("rendezvous");
+        }
+    }
+
+    // [watcher] sub-table
+    let watcher_entry = table
+        .entry("watcher".to_string())
+        .or_insert_with(|| toml::Value::Table(toml::map::Map::new()));
+    if let toml::Value::Table(wt) = watcher_entry {
+        wt.insert(
+            "sweep_interval_secs".to_string(),
+            toml::Value::Integer(s.watcher_sweep_interval_secs as i64),
+        );
+        wt.insert(
+            "done_retention_secs".to_string(),
+            toml::Value::Integer(s.watcher_done_retention_secs as i64),
+        );
+    }
+
+    // Ensure .huskies/ directory exists
+    if let Some(parent) = config_path.parent() {
+        std::fs::create_dir_all(parent).map_err(|e| format!("Create .huskies dir: {e}"))?;
+    }
+
+    let new_content = toml::to_string_pretty(&val).map_err(|e| format!("Serialize config: {e}"))?;
+    std::fs::write(&config_path, new_content).map_err(|e| format!("Write config: {e}"))?;
+
+    Ok(())
+}
+
 #[derive(Tags)]
 enum SettingsTags {
     Settings,
@@ -71,6 +239,30 @@ impl SettingsApi {
         Ok(Json(OpenFileResponse { success: true }))
     }

+    /// Get current project.toml scalar settings as JSON.
+    #[oai(path = "/settings", method = "get")]
+    async fn get_settings(&self) -> OpenApiResult<Json<ProjectSettings>> {
+        let project_root = self.ctx.state.get_project_root().map_err(bad_request)?;
+        let config = ProjectConfig::load(&project_root).map_err(bad_request)?;
+        Ok(Json(settings_from_config(&config)))
+    }
+
+    /// Update project.toml scalar settings. Array sections (component, agent) are preserved.
+    ///
+    /// Returns 400 if the input fails validation (e.g. unknown qa mode, negative max_retries).
+    #[oai(path = "/settings", method = "put")]
+    async fn put_settings(
+        &self,
+        payload: Json<ProjectSettings>,
+    ) -> OpenApiResult<Json<ProjectSettings>> {
+        validate_project_settings(&payload.0).map_err(bad_request)?;
+        let project_root = self.ctx.state.get_project_root().map_err(bad_request)?;
+        write_project_settings(&project_root, &payload.0).map_err(bad_request)?;
+        // Re-read to confirm what was written
+        let config = ProjectConfig::load(&project_root).map_err(bad_request)?;
+        Ok(Json(settings_from_config(&config)))
+    }
+
     /// Set the preferred editor command (e.g. "zed", "code", "cursor").
     /// Pass null or empty string to clear the preference.
     #[oai(path = "/settings/editor", method = "put")]
@@ -360,4 +552,222 @@ mod tests {
             .await;
         assert!(result.is_err());
     }
+
+    // ── /api/settings GET/PUT ──────────────────────────────────────────────
+
+    fn default_project_settings() -> ProjectSettings {
+        let cfg = ProjectConfig::default();
+        settings_from_config(&cfg)
+    }
+
+    #[tokio::test]
+    async fn get_settings_returns_defaults_when_no_project_toml() {
+        let dir = TempDir::new().unwrap();
+        // Create .huskies dir so project root detection works but no project.toml
+        std::fs::create_dir_all(dir.path().join(".huskies")).unwrap();
+        let ctx = AppContext::new_test(dir.path().to_path_buf());
+        let api = SettingsApi { ctx: Arc::new(ctx) };
+        let result = api.get_settings().await.unwrap().0;
+        assert_eq!(result.default_qa, "server");
+        assert_eq!(result.max_retries, 2);
+        assert!(result.rate_limit_notifications);
+    }
+
+    #[tokio::test]
+    async fn put_settings_writes_and_returns_settings() {
+        let dir = TempDir::new().unwrap();
+        std::fs::create_dir_all(dir.path().join(".huskies")).unwrap();
+        let ctx = AppContext::new_test(dir.path().to_path_buf());
+        let api = SettingsApi { ctx: Arc::new(ctx) };
+
+        let mut s = default_project_settings();
+        s.default_qa = "agent".to_string();
+        s.max_retries = 5;
+        s.rate_limit_notifications = false;
+
+        let result = api.put_settings(Json(s)).await.unwrap().0;
+        assert_eq!(result.default_qa, "agent");
+        assert_eq!(result.max_retries, 5);
+        assert!(!result.rate_limit_notifications);
+    }
+
+    #[tokio::test]
+    async fn put_settings_preserves_agent_sections() {
+        let dir = TempDir::new().unwrap();
+        let huskies_dir = dir.path().join(".huskies");
+        std::fs::create_dir_all(&huskies_dir).unwrap();
+
+        // Write a project.toml with agent sections
+        std::fs::write(
+            huskies_dir.join("project.toml"),
+            r#"
+[[agent]]
+name = "coder-1"
+model = "sonnet"
+stage = "coder"
+
+[[component]]
+name = "server"
+path = "."
+"#,
+        )
+        .unwrap();
+
+        let ctx = AppContext::new_test(dir.path().to_path_buf());
+        let api = SettingsApi { ctx: Arc::new(ctx) };
+
+        let mut s = default_project_settings();
+        s.default_qa = "human".to_string();
+        api.put_settings(Json(s)).await.unwrap();
+
+        // Re-read the file and verify agent/component sections are still there
+        let written = std::fs::read_to_string(huskies_dir.join("project.toml")).unwrap();
+        assert!(
+            written.contains("coder-1"),
+            "agent section should be preserved"
+        );
+        assert!(
+            written.contains("server"),
+            "component section should be preserved"
+        );
+        assert!(written.contains("human"), "new setting should be written");
+    }
+
+    #[tokio::test]
+    async fn put_settings_rejects_invalid_qa_mode() {
+        let dir = TempDir::new().unwrap();
+        std::fs::create_dir_all(dir.path().join(".huskies")).unwrap();
+        let ctx = AppContext::new_test(dir.path().to_path_buf());
+        let api = SettingsApi { ctx: Arc::new(ctx) };
+
+        let mut s = default_project_settings();
+        s.default_qa = "invalid_mode".to_string();
+
+        let result = api.put_settings(Json(s)).await;
+        assert!(result.is_err());
+        let err = result.unwrap_err();
+        assert_eq!(err.status(), poem::http::StatusCode::BAD_REQUEST);
+    }
+
+    #[test]
+    fn validate_project_settings_accepts_valid_qa_modes() {
+        for mode in &["server", "agent", "human"] {
+            let s = ProjectSettings {
+                default_qa: mode.to_string(),
+                default_coder_model: None,
+                max_coders: None,
+                max_retries: 2,
+                base_branch: None,
+                rate_limit_notifications: true,
+                timezone: None,
+                rendezvous: None,
+                watcher_sweep_interval_secs: 60,
+                watcher_done_retention_secs: 14400,
+            };
+            assert!(
+                validate_project_settings(&s).is_ok(),
+                "qa mode '{mode}' should be valid"
+            );
+        }
+    }
+
+    #[test]
+    fn validate_project_settings_rejects_unknown_qa_mode() {
+        let s = ProjectSettings {
+            default_qa: "robot".to_string(),
+            default_coder_model: None,
+            max_coders: None,
+            max_retries: 2,
+            base_branch: None,
+            rate_limit_notifications: true,
+            timezone: None,
+            rendezvous: None,
+            watcher_sweep_interval_secs: 60,
+            watcher_done_retention_secs: 14400,
+        };
+        let err = validate_project_settings(&s).unwrap_err();
+        assert!(err.contains("robot"));
+    }
+
+    #[test]
+    fn write_and_read_project_settings_roundtrip() {
+        let dir = TempDir::new().unwrap();
+        std::fs::create_dir_all(dir.path().join(".huskies")).unwrap();
+
+        let s = ProjectSettings {
+            default_qa: "agent".to_string(),
+            default_coder_model: Some("opus".to_string()),
+            max_coders: Some(2),
+            max_retries: 3,
+            base_branch: Some("main".to_string()),
+            rate_limit_notifications: false,
+            timezone: Some("America/New_York".to_string()),
+            rendezvous: Some("ws://host:3001/crdt-sync".to_string()),
+            watcher_sweep_interval_secs: 30,
+            watcher_done_retention_secs: 7200,
+        };
+
+        write_project_settings(dir.path(), &s).unwrap();
+
+        let config = ProjectConfig::load(dir.path()).unwrap();
+        let loaded = settings_from_config(&config);
+
+        assert_eq!(loaded.default_qa, "agent");
+        assert_eq!(loaded.default_coder_model, Some("opus".to_string()));
+        assert_eq!(loaded.max_coders, Some(2));
+        assert_eq!(loaded.max_retries, 3);
+        assert_eq!(loaded.base_branch, Some("main".to_string()));
+        assert!(!loaded.rate_limit_notifications);
+        assert_eq!(loaded.timezone, Some("America/New_York".to_string()));
+        assert_eq!(
+            loaded.rendezvous,
+            Some("ws://host:3001/crdt-sync".to_string())
+        );
+        assert_eq!(loaded.watcher_sweep_interval_secs, 30);
+        assert_eq!(loaded.watcher_done_retention_secs, 7200);
+    }
+
+    #[test]
+    fn write_project_settings_clears_optional_fields_when_none() {
+        let dir = TempDir::new().unwrap();
+        let huskies_dir = dir.path().join(".huskies");
+        std::fs::create_dir_all(&huskies_dir).unwrap();
+
+        // First write with optional fields set
+        let s_with = ProjectSettings {
+            default_qa: "server".to_string(),
+            default_coder_model: Some("sonnet".to_string()),
+            max_coders: Some(3),
+            max_retries: 2,
+            base_branch: Some("master".to_string()),
+            rate_limit_notifications: true,
+            timezone: Some("UTC".to_string()),
+            rendezvous: None,
+            watcher_sweep_interval_secs: 60,
+            watcher_done_retention_secs: 14400,
+        };
+        write_project_settings(dir.path(), &s_with).unwrap();
+
+        // Then write with optional fields cleared
+        let s_clear = ProjectSettings {
+            default_qa: "server".to_string(),
+            default_coder_model: None,
+            max_coders: None,
+            max_retries: 2,
+            base_branch: None,
+            rate_limit_notifications: true,
+            timezone: None,
+            rendezvous: None,
+            watcher_sweep_interval_secs: 60,
+            watcher_done_retention_secs: 14400,
+        };
+        write_project_settings(dir.path(), &s_clear).unwrap();
+
+        let config = ProjectConfig::load(dir.path()).unwrap();
+        let loaded = settings_from_config(&config);
+        assert!(loaded.default_coder_model.is_none());
+        assert!(loaded.max_coders.is_none());
+        assert!(loaded.base_branch.is_none());
+        assert!(loaded.timezone.is_none());
+    }
 }
@@ -195,7 +195,7 @@ mod tests {
         let body: serde_json::Value = resp.0.into_body().into_json().await.unwrap();
         assert_eq!(body["current_step_index"], 1);
         assert!(!body["completed"].as_bool().unwrap());
-        assert_eq!(body["steps"].as_array().unwrap().len(), 6);
+        assert_eq!(body["steps"].as_array().unwrap().len(), 8);
         assert_eq!(body["steps"][0]["status"], "confirmed");
     }

@@ -279,11 +279,13 @@ mod tests {
         let (dir, client) = setup();
         WizardState::init_if_missing(dir.path());

-        // Steps 2-6 (scaffold is already confirmed)
+        // Steps 2-8 (scaffold is already confirmed)
         let steps = [
             "context",
             "stack",
             "test_script",
+            "build_script",
+            "lint_script",
             "release_script",
             "test_coverage",
         ];
+21 -11
@@ -37,6 +37,13 @@ pub(crate) async fn ensure_project_root_with_story_kit(
         if !path.join(".huskies").is_dir() {
             scaffold_story_kit(&path, port)?;
         }
+        // Always update .mcp.json with the current port so the bot connects to
+        // the right endpoint even when HUSKIES_PORT changes between restarts.
+        let mcp_content = format!(
+            "{{\n  \"mcpServers\": {{\n    \"huskies\": {{\n      \"type\": \"http\",\n      \"url\": \"http://localhost:{port}/mcp\"\n    }}\n  }}\n}}\n"
+        );
+        fs::write(path.join(".mcp.json"), mcp_content)
+            .map_err(|e| format!("Failed to write .mcp.json: {}", e))?;
         Ok(())
     })
     .await
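For reference, the `format!` string above renders a `.mcp.json` of this shape (shown here for port 3001; the indentation inside the escaped string was partly lost in this rendering, so whitespace is approximate):

```json
{
  "mcpServers": {
    "huskies": {
      "type": "http",
      "url": "http://localhost:3001/mcp"
    }
  }
}
```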
@@ -194,16 +201,15 @@ mod tests {
     }

     #[tokio::test]
-    async fn open_project_does_not_overwrite_existing_mcp_json() {
-        // scaffold must NOT overwrite .mcp.json when it already exists — QA
-        // test servers share the real project root, and re-writing would
-        // clobber the file with the wrong port.
+    async fn open_project_updates_mcp_json_with_current_port() {
+        // .mcp.json must always be updated with the actual running port so the
+        // bot connects to the right MCP endpoint even when HUSKIES_PORT changes.
         let dir = tempdir().unwrap();
         let project_dir = dir.path().join("myproject");
         fs::create_dir_all(&project_dir).unwrap();
-        // Pre-write .mcp.json with a different port to simulate an already-configured project.
+        // Pre-write .mcp.json with a different port to simulate a stale file.
         let mcp_path = project_dir.join(".mcp.json");
-        fs::write(&mcp_path, "{\"existing\": true}").unwrap();
+        fs::write(&mcp_path, "{\"stale\": true}").unwrap();
         let store = make_store(&dir);
         let state = SessionState::default();

@@ -211,15 +217,19 @@ mod tests {
             project_dir.to_string_lossy().to_string(),
             &state,
             &store,
-            3001,
+            3002,
         )
         .await
         .unwrap();

-        assert_eq!(
-            fs::read_to_string(&mcp_path).unwrap(),
-            "{\"existing\": true}",
-            "open_project must not overwrite an existing .mcp.json"
+        let content = fs::read_to_string(&mcp_path).unwrap();
+        assert!(
+            content.contains("3002"),
+            "open_project must update .mcp.json with the actual running port"
+        );
+        assert!(
+            content.contains("localhost"),
+            "mcp.json must reference localhost"
         );
     }
@@ -100,6 +100,24 @@ const DEFAULT_PROJECT_SETTINGS_TOML: &str = r#"# Project-wide default QA mode: "
 # Per-story `qa` front matter overrides this setting.
 default_qa = "server"
+
+# Maximum number of retries per story per pipeline stage before marking as blocked.
+# Set to 0 to disable retry limits.
+max_retries = 2
+
+# Default model for coder-stage agents (e.g. "sonnet", "opus").
+# When set, only coder agents whose model matches this value are considered for
+# auto-assignment, so opus agents are only used when explicitly requested via
+# story front matter `agent:` field.
+# default_coder_model = "sonnet"
+
+# Maximum number of concurrent coder-stage agents.
+# Stories wait in 2_current/ until a slot frees up.
+# max_coders = 3
+
+# Override the base branch for worktree creation and merge operations.
+# When not set, the system auto-detects the base branch from the current HEAD.
+# base_branch = "main"

 # Suppress soft rate-limit warning notifications in chat.
 # Hard blocks and story-blocked notifications are always sent.
 # rate_limit_notifications = true
@@ -199,33 +217,202 @@ pub fn detect_components_toml(root: &Path) -> String {
     sections.join("\n")
 }

+/// Detect the appropriate Node.js test command for a directory containing `package.json`.
+///
+/// Reads the `package.json` content to identify known test runners (vitest, jest).
+/// Falls back to `npm test` or `pnpm test` based on which lock file is present.
+fn detect_node_test_cmd(pkg_dir: &Path) -> String {
+    let has_pnpm = pkg_dir.join("pnpm-lock.yaml").exists();
+    let content = std::fs::read_to_string(pkg_dir.join("package.json")).unwrap_or_default();
+
+    if content.contains("\"vitest\"") {
+        let pm = if has_pnpm { "pnpm" } else { "npx" };
+        return format!("{} vitest run", pm);
+    }
+    if content.contains("\"jest\"") {
+        let pm = if has_pnpm { "pnpm" } else { "npx" };
+        return format!("{} jest", pm);
+    }
+
+    if has_pnpm {
+        "pnpm test".to_string()
+    } else {
+        "npm test".to_string()
+    }
+}
+
+/// Detect the appropriate Node.js build command for a directory containing `package.json`.
+fn detect_node_build_cmd(pkg_dir: &Path) -> String {
+    if pkg_dir.join("pnpm-lock.yaml").exists() {
+        "pnpm run build".to_string()
+    } else {
+        "npm run build".to_string()
+    }
+}
+
+/// Detect the appropriate Node.js lint command for a directory containing `package.json`.
+///
+/// Reads the `package.json` content to identify eslint. Falls back to
+/// `npm run lint` or `pnpm run lint` based on which lock file is present.
+fn detect_node_lint_cmd(pkg_dir: &Path) -> String {
+    let has_pnpm = pkg_dir.join("pnpm-lock.yaml").exists();
+    let content = std::fs::read_to_string(pkg_dir.join("package.json")).unwrap_or_default();
+    if content.contains("\"eslint\"") {
+        let pm = if has_pnpm { "pnpm" } else { "npx" };
+        return format!("{pm} eslint .");
+    }
+    if has_pnpm {
+        "pnpm run lint".to_string()
+    } else {
+        "npm run lint".to_string()
+    }
+}
+
+/// Generate `script/build` content for a new project at `root`.
+///
+/// Inspects well-known marker files to identify which tech stacks are present
+/// and emits the appropriate build commands. Multi-stack projects get combined
+/// commands run sequentially. Falls back to a generic stub when no markers
+/// are found so the scaffold is always valid.
+///
+/// For projects with a frontend in a known subdirectory (`frontend/`, `client/`),
+/// the build command is detected from the presence of `pnpm-lock.yaml`.
+pub fn detect_script_build(root: &Path) -> String {
+    let mut commands: Vec<String> = Vec::new();
+
+    if root.join("Cargo.toml").exists() {
+        commands.push("cargo build --release".to_string());
+    }
+
+    if root.join("package.json").exists() {
+        commands.push(detect_node_build_cmd(root));
+    }
+
+    // Detect frontend in known subdirectories (e.g. frontend/, client/)
+    for subdir in &["frontend", "client"] {
+        let sub_path = root.join(subdir);
+        if sub_path.join("package.json").exists() {
+            let cmd = detect_node_build_cmd(&sub_path);
+            commands.push(format!("(cd {} && {})", subdir, cmd));
+        }
+    }
+
+    if root.join("pyproject.toml").exists() {
+        commands.push("python -m build".to_string());
+    }
+
+    if root.join("go.mod").exists() {
+        commands.push("go build ./...".to_string());
+    }
+
+    if commands.is_empty() {
+        return "#!/usr/bin/env bash\nset -euo pipefail\n\n# Add your project's build commands here.\necho \"No build configured\"\n".to_string();
+    }
+
+    let mut script = "#!/usr/bin/env bash\nset -euo pipefail\n\n".to_string();
+    for cmd in commands {
+        script.push_str(&cmd);
+        script.push('\n');
+    }
+    script
+}
+
+/// Generate `script/lint` content for a new project at `root`.
+///
+/// Inspects well-known marker files to identify which linters are present
+/// and emits the appropriate lint commands. Multi-stack projects get combined
+/// commands run sequentially. Falls back to a generic stub when no markers
+/// are found so the scaffold is always valid.
+///
+/// For projects with a frontend in a known subdirectory (`frontend/`, `client/`),
+/// the lint command is detected from the `package.json` (eslint, npm, pnpm).
+pub fn detect_script_lint(root: &Path) -> String {
+    let mut commands: Vec<String> = Vec::new();
+
+    if root.join("Cargo.toml").exists() {
+        commands.push("cargo fmt --all --check".to_string());
+        commands.push("cargo clippy -- -D warnings".to_string());
+    }
+
+    if root.join("package.json").exists() {
+        commands.push(detect_node_lint_cmd(root));
+    }
+
+    // Detect frontend in known subdirectories (e.g. frontend/, client/)
+    for subdir in &["frontend", "client"] {
+        let sub_path = root.join(subdir);
+        if sub_path.join("package.json").exists() {
+            let cmd = detect_node_lint_cmd(&sub_path);
+            commands.push(format!("(cd {} && {})", subdir, cmd));
+        }
+    }
+
+    if root.join("pyproject.toml").exists() || root.join("requirements.txt").exists() {
+        let mut content = std::fs::read_to_string(root.join("pyproject.toml")).unwrap_or_default();
+        content
+            .push_str(&std::fs::read_to_string(root.join("requirements.txt")).unwrap_or_default());
+        if content.contains("ruff") {
+            commands.push("ruff check .".to_string());
+        } else {
+            commands.push("flake8 .".to_string());
+        }
+    }
+
+    if root.join("go.mod").exists() {
+        commands.push("go vet ./...".to_string());
+    }
+
+    if commands.is_empty() {
+        return "#!/usr/bin/env bash\nset -euo pipefail\n\n# Add your project's lint commands here.\necho \"No linters configured\"\n".to_string();
+    }
+
+    let mut script = "#!/usr/bin/env bash\nset -euo pipefail\n\n".to_string();
+    for cmd in commands {
+        script.push_str(&cmd);
+        script.push('\n');
+    }
+    script
+}
+
 /// Generate `script/test` content for a new project at `root`.
 ///
 /// Inspects well-known marker files to identify which tech stacks are present
 /// and emits the appropriate test commands. Multi-stack projects get combined
 /// commands run sequentially. Falls back to the generic stub when no markers
 /// are found so the scaffold is always valid.
+///
+/// For projects with a frontend in a known subdirectory (`frontend/`, `client/`),
+/// the test runner is detected from the `package.json` (vitest, jest, npm, pnpm).
 pub fn detect_script_test(root: &Path) -> String {
-    let mut commands: Vec<&str> = Vec::new();
+    let mut commands: Vec<String> = Vec::new();

     if root.join("Cargo.toml").exists() {
-        commands.push("cargo test");
+        commands.push("cargo test".to_string());
     }

     if root.join("package.json").exists() {
         if root.join("pnpm-lock.yaml").exists() {
-            commands.push("pnpm test");
+            commands.push("pnpm test".to_string());
         } else {
-            commands.push("npm test");
+            commands.push("npm test".to_string());
+        }
+    }
+
+    // Detect frontend in known subdirectories (e.g. frontend/, client/)
+    for subdir in &["frontend", "client"] {
+        let sub_path = root.join(subdir);
+        if sub_path.join("package.json").exists() {
+            let cmd = detect_node_test_cmd(&sub_path);
+            commands.push(format!("(cd {} && {})", subdir, cmd));
         }
     }

     if root.join("pyproject.toml").exists() || root.join("requirements.txt").exists() {
-        commands.push("pytest");
+        commands.push("pytest".to_string());
     }

     if root.join("go.mod").exists() {
-        commands.push("go test ./...");
+        commands.push("go test ./...".to_string());
     }

     if commands.is_empty() {
@@ -234,7 +421,7 @@ pub fn detect_script_test(root: &Path) -> String {
 
     let mut script = "#!/usr/bin/env bash\nset -euo pipefail\n\n".to_string();
     for cmd in commands {
-        script.push_str(cmd);
+        script.push_str(&cmd);
         script.push('\n');
     }
     script
@@ -298,6 +485,8 @@ fn write_story_kit_gitignore(root: &Path) -> Result<(), String> {
         "token_usage.jsonl",
         "wizard_state.json",
         "store.json",
+        "pipeline.db",
+        "*.db",
     ];
 
     let gitignore_path = root.join(".huskies").join(".gitignore");
@@ -411,6 +600,10 @@ pub(crate) fn scaffold_story_kit(root: &Path, port: u16) -> Result<(), String> {
     write_file_if_missing(&tech_root.join("STACK.md"), STORY_KIT_STACK)?;
     let script_test_content = detect_script_test(root);
     write_script_if_missing(&script_root.join("test"), &script_test_content)?;
+    let script_build_content = detect_script_build(root);
+    write_script_if_missing(&script_root.join("build"), &script_build_content)?;
+    let script_lint_content = detect_script_lint(root);
+    write_script_if_missing(&script_root.join("lint"), &script_lint_content)?;
     write_file_if_missing(&root.join("CLAUDE.md"), STORY_KIT_CLAUDE_MD)?;
 
     // Write per-transport bot.toml example files so users can see all options.
@@ -584,6 +777,78 @@ mod tests {
         );
     }
 
+    #[test]
+    fn scaffold_project_toml_contains_max_retries_with_default_value() {
+        let dir = tempdir().unwrap();
+        scaffold_story_kit(dir.path(), 3001).unwrap();
+
+        let content = fs::read_to_string(dir.path().join(".huskies/project.toml")).unwrap();
+        assert!(
+            content.contains("max_retries = 2"),
+            "project.toml scaffold should include max_retries with default value 2"
+        );
+        assert!(
+            content.contains("Maximum number of retries"),
+            "project.toml scaffold should include a comment explaining max_retries"
+        );
+    }
+
+    #[test]
+    fn scaffold_project_toml_contains_commented_out_optional_fields() {
+        let dir = tempdir().unwrap();
+        scaffold_story_kit(dir.path(), 3001).unwrap();
+
+        let content = fs::read_to_string(dir.path().join(".huskies/project.toml")).unwrap();
+        assert!(
+            content.contains("# default_coder_model"),
+            "project.toml scaffold should include commented-out default_coder_model"
+        );
+        assert!(
+            content.contains("# max_coders"),
+            "project.toml scaffold should include commented-out max_coders"
+        );
+        assert!(
+            content.contains("# base_branch"),
+            "project.toml scaffold should include commented-out base_branch"
+        );
+    }
+
+    #[test]
+    fn scaffold_project_toml_round_trips_through_project_config_load() {
+        use crate::config::ProjectConfig;
+
+        let dir = tempdir().unwrap();
+        scaffold_story_kit(dir.path(), 3001).unwrap();
+
+        // The generated project.toml must parse without error.
+        let config = ProjectConfig::load(dir.path())
+            .expect("Generated project.toml should parse without error");
+
+        // Key defaults must survive the round-trip.
+        assert_eq!(config.default_qa, "server");
+        assert_eq!(config.max_retries, 2);
+        assert!(
+            config.rate_limit_notifications,
+            "rate_limit_notifications should default to true"
+        );
+        assert!(
+            config.default_coder_model.is_none(),
+            "default_coder_model should be None when commented out"
+        );
+        assert!(
+            config.max_coders.is_none(),
+            "max_coders should be None when commented out"
+        );
+        assert!(
+            config.base_branch.is_none(),
+            "base_branch should be None when commented out"
+        );
+        assert!(
+            config.timezone.is_none(),
+            "timezone should be None when commented out"
+        );
+    }
+
     #[test]
     fn scaffold_context_is_blank_template_not_story_kit_content() {
         let dir = tempdir().unwrap();
@@ -744,6 +1009,9 @@ mod tests {
         assert!(!root_content.contains(".huskies/coverage/"));
         // store.json must be in .huskies/.gitignore instead
         assert!(sk_content.contains("store.json"));
+        // Database files must be ignored so novice users don't accidentally commit them
+        assert!(sk_content.contains("pipeline.db"));
+        assert!(sk_content.contains("*.db"));
     }
 
     #[test]
@@ -1165,6 +1433,141 @@ mod tests {
         );
     }
 
+    #[test]
+    fn detect_script_test_frontend_subdir_with_vitest_uses_npx_vitest() {
+        let dir = tempdir().unwrap();
+        let frontend = dir.path().join("frontend");
+        fs::create_dir_all(&frontend).unwrap();
+        fs::write(
+            frontend.join("package.json"),
+            r#"{"devDependencies":{"vitest":"^1.0.0"},"scripts":{"test":"vitest run"}}"#,
+        )
+        .unwrap();
+
+        let script = detect_script_test(dir.path());
+        assert!(
+            script.contains("vitest run"),
+            "frontend with vitest should emit vitest run"
+        );
+        assert!(
+            script.contains("cd frontend"),
+            "should cd into the frontend directory"
+        );
+        assert!(
+            !script.contains("No tests configured"),
+            "should not use stub when frontend is detected"
+        );
+    }
+
+    #[test]
+    fn detect_script_test_frontend_subdir_with_jest_uses_npx_jest() {
+        let dir = tempdir().unwrap();
+        let frontend = dir.path().join("frontend");
+        fs::create_dir_all(&frontend).unwrap();
+        fs::write(
+            frontend.join("package.json"),
+            r#"{"devDependencies":{"jest":"^29.0.0"},"scripts":{"test":"jest"}}"#,
+        )
+        .unwrap();
+
+        let script = detect_script_test(dir.path());
+        assert!(
+            script.contains("jest"),
+            "frontend with jest should emit jest"
+        );
+        assert!(
+            script.contains("cd frontend"),
+            "should cd into the frontend directory"
+        );
+    }
+
+    #[test]
+    fn detect_script_test_frontend_subdir_no_known_runner_uses_npm_test() {
+        let dir = tempdir().unwrap();
+        let frontend = dir.path().join("frontend");
+        fs::create_dir_all(&frontend).unwrap();
+        fs::write(
+            frontend.join("package.json"),
+            r#"{"scripts":{"test":"mocha"}}"#,
+        )
+        .unwrap();
+
+        let script = detect_script_test(dir.path());
+        assert!(
+            script.contains("npm test"),
+            "frontend without known runner should fall back to npm test"
+        );
+        assert!(script.contains("cd frontend"));
+    }
+
+    #[test]
+    fn detect_script_test_frontend_subdir_pnpm_uses_pnpm_vitest() {
+        let dir = tempdir().unwrap();
+        let frontend = dir.path().join("frontend");
+        fs::create_dir_all(&frontend).unwrap();
+        fs::write(
+            frontend.join("package.json"),
+            r#"{"devDependencies":{"vitest":"^1.0.0"}}"#,
+        )
+        .unwrap();
+        fs::write(frontend.join("pnpm-lock.yaml"), "").unwrap();
+
+        let script = detect_script_test(dir.path());
+        assert!(
+            script.contains("pnpm vitest run"),
+            "pnpm frontend with vitest should use pnpm vitest run"
+        );
+    }
+
+    #[test]
+    fn detect_script_test_rust_plus_frontend_subdir_both_included() {
+        let dir = tempdir().unwrap();
+        fs::write(
+            dir.path().join("Cargo.toml"),
+            "[package]\nname = \"server\"\n",
+        )
+        .unwrap();
+        let frontend = dir.path().join("frontend");
+        fs::create_dir_all(&frontend).unwrap();
+        fs::write(
+            frontend.join("package.json"),
+            r#"{"devDependencies":{"vitest":"^1.0.0"}}"#,
+        )
+        .unwrap();
+
+        let script = detect_script_test(dir.path());
+        assert!(
+            script.contains("cargo test"),
+            "Rust + frontend should include cargo test"
+        );
+        assert!(
+            script.contains("vitest run"),
+            "Rust + frontend should include vitest run"
+        );
+        assert!(
+            script.contains("cd frontend"),
+            "Rust + frontend should cd into frontend"
+        );
+    }
+
+    #[test]
+    fn detect_script_test_client_subdir_detected() {
+        let dir = tempdir().unwrap();
+        let client = dir.path().join("client");
+        fs::create_dir_all(&client).unwrap();
+        fs::write(
+            client.join("package.json"),
+            r#"{"scripts":{"test":"jest"}}"#,
+        )
+        .unwrap();
+
+        let script = detect_script_test(dir.path());
+        assert!(
+            script.contains("cd client"),
+            "client/ subdir should also be detected"
+        );
+    }
+
     #[test]
     fn detect_script_test_output_starts_with_shebang() {
         let dir = tempdir().unwrap();
@@ -1211,6 +1614,347 @@ mod tests {
         );
     }
 
+    // --- detect_script_build ---
+
+    #[test]
+    fn detect_script_build_no_markers_returns_stub() {
+        let dir = tempdir().unwrap();
+        let script = detect_script_build(dir.path());
+        assert!(
+            script.contains("No build configured"),
+            "fallback should contain the generic stub message"
+        );
+        assert!(script.starts_with("#!/usr/bin/env bash"));
+    }
+
+    #[test]
+    fn detect_script_build_cargo_toml_adds_cargo_build_release() {
+        let dir = tempdir().unwrap();
+        fs::write(dir.path().join("Cargo.toml"), "[package]\nname = \"x\"\n").unwrap();
+
+        let script = detect_script_build(dir.path());
+        assert!(
+            script.contains("cargo build --release"),
+            "Rust project should run cargo build --release"
+        );
+        assert!(!script.contains("No build configured"));
+    }
+
+    #[test]
+    fn detect_script_build_package_json_npm_adds_npm_run_build() {
+        let dir = tempdir().unwrap();
+        fs::write(dir.path().join("package.json"), "{}").unwrap();
+
+        let script = detect_script_build(dir.path());
+        assert!(
+            script.contains("npm run build"),
+            "Node project without pnpm-lock should run npm run build"
+        );
+    }
+
+    #[test]
+    fn detect_script_build_package_json_pnpm_adds_pnpm_run_build() {
+        let dir = tempdir().unwrap();
+        fs::write(dir.path().join("package.json"), "{}").unwrap();
+        fs::write(dir.path().join("pnpm-lock.yaml"), "").unwrap();
+
+        let script = detect_script_build(dir.path());
+        assert!(
+            script.contains("pnpm run build"),
+            "Node project with pnpm-lock should run pnpm run build"
+        );
+        assert!(
+            !script.lines().any(|l| l.trim() == "npm run build"),
+            "should not use npm when pnpm-lock.yaml is present"
+        );
+    }
+
+    #[test]
+    fn detect_script_build_go_mod_adds_go_build() {
+        let dir = tempdir().unwrap();
+        fs::write(dir.path().join("go.mod"), "module example.com/app\n").unwrap();
+
+        let script = detect_script_build(dir.path());
+        assert!(
+            script.contains("go build ./..."),
+            "Go project should run go build ./..."
+        );
+    }
+
+    #[test]
+    fn detect_script_build_pyproject_toml_adds_python_build() {
+        let dir = tempdir().unwrap();
+        fs::write(
+            dir.path().join("pyproject.toml"),
+            "[project]\nname = \"x\"\n",
+        )
+        .unwrap();
+
+        let script = detect_script_build(dir.path());
+        assert!(
+            script.contains("python -m build"),
+            "Python project should run python -m build"
+        );
+    }
+
+    #[test]
+    fn detect_script_build_frontend_subdir_detected() {
+        let dir = tempdir().unwrap();
+        let frontend = dir.path().join("frontend");
+        fs::create_dir_all(&frontend).unwrap();
+        fs::write(frontend.join("package.json"), "{}").unwrap();
+
+        let script = detect_script_build(dir.path());
+        assert!(
+            script.contains("cd frontend"),
+            "frontend subdir should be detected for build"
+        );
+        assert!(script.contains("npm run build"));
+    }
+
+    #[test]
+    fn detect_script_build_rust_plus_frontend_subdir_both_included() {
+        let dir = tempdir().unwrap();
+        fs::write(
+            dir.path().join("Cargo.toml"),
+            "[package]\nname = \"server\"\n",
+        )
+        .unwrap();
+        let frontend = dir.path().join("frontend");
+        fs::create_dir_all(&frontend).unwrap();
+        fs::write(frontend.join("package.json"), "{}").unwrap();
+
+        let script = detect_script_build(dir.path());
+        assert!(script.contains("cargo build --release"));
+        assert!(script.contains("cd frontend"));
+        assert!(script.contains("npm run build"));
+    }
+
+    // --- detect_script_lint ---
+
+    #[test]
+    fn detect_script_lint_no_markers_returns_stub() {
+        let dir = tempdir().unwrap();
+        let script = detect_script_lint(dir.path());
+        assert!(
+            script.contains("No linters configured"),
+            "fallback should contain the generic stub message"
+        );
+        assert!(script.starts_with("#!/usr/bin/env bash"));
+    }
+
+    #[test]
+    fn detect_script_lint_cargo_toml_adds_fmt_and_clippy() {
+        let dir = tempdir().unwrap();
+        fs::write(dir.path().join("Cargo.toml"), "[package]\nname = \"x\"\n").unwrap();
+
+        let script = detect_script_lint(dir.path());
+        assert!(
+            script.contains("cargo fmt --all --check"),
+            "Rust project should check formatting"
+        );
+        assert!(
+            script.contains("cargo clippy -- -D warnings"),
+            "Rust project should run clippy"
+        );
+        assert!(!script.contains("No linters configured"));
+    }
+
+    #[test]
+    fn detect_script_lint_package_json_without_eslint_uses_npm_run_lint() {
+        let dir = tempdir().unwrap();
+        fs::write(dir.path().join("package.json"), "{}").unwrap();
+
+        let script = detect_script_lint(dir.path());
+        assert!(
+            script.contains("npm run lint"),
+            "Node project without eslint dep should fall back to npm run lint"
+        );
+    }
+
+    #[test]
+    fn detect_script_lint_package_json_with_eslint_uses_npx_eslint() {
+        let dir = tempdir().unwrap();
+        fs::write(
+            dir.path().join("package.json"),
+            r#"{"devDependencies":{"eslint":"^8.0.0"}}"#,
+        )
+        .unwrap();
+
+        let script = detect_script_lint(dir.path());
+        assert!(
+            script.contains("npx eslint ."),
+            "Node project with eslint should use npx eslint ."
+        );
+    }
+
+    #[test]
+    fn detect_script_lint_pnpm_with_eslint_uses_pnpm_eslint() {
+        let dir = tempdir().unwrap();
+        fs::write(
+            dir.path().join("package.json"),
+            r#"{"devDependencies":{"eslint":"^8.0.0"}}"#,
+        )
+        .unwrap();
+        fs::write(dir.path().join("pnpm-lock.yaml"), "").unwrap();
+
+        let script = detect_script_lint(dir.path());
+        assert!(
+            script.contains("pnpm eslint ."),
+            "pnpm project with eslint should use pnpm eslint ."
+        );
+    }
+
+    #[test]
+    fn detect_script_lint_python_requirements_uses_flake8() {
+        let dir = tempdir().unwrap();
+        fs::write(dir.path().join("requirements.txt"), "flask\n").unwrap();
+
+        let script = detect_script_lint(dir.path());
+        assert!(
+            script.contains("flake8 ."),
+            "Python project without ruff should use flake8"
+        );
+    }
+
+    #[test]
+    fn detect_script_lint_python_with_ruff_uses_ruff() {
+        let dir = tempdir().unwrap();
+        fs::write(
+            dir.path().join("pyproject.toml"),
+            "[project]\nname = \"x\"\n\n[tool.ruff]\n",
+        )
+        .unwrap();
+
+        let script = detect_script_lint(dir.path());
+        assert!(
+            script.contains("ruff check ."),
+            "Python project with ruff configured should use ruff"
+        );
+        assert!(
+            !script.contains("flake8"),
+            "should not use flake8 when ruff is configured"
+        );
+    }
+
+    #[test]
+    fn detect_script_lint_go_mod_adds_go_vet() {
+        let dir = tempdir().unwrap();
+        fs::write(dir.path().join("go.mod"), "module example.com/app\n").unwrap();
+
+        let script = detect_script_lint(dir.path());
+        assert!(
+            script.contains("go vet ./..."),
+            "Go project should run go vet ./..."
+        );
+    }
+
+    #[test]
+    fn detect_script_lint_frontend_subdir_detected() {
+        let dir = tempdir().unwrap();
+        let frontend = dir.path().join("frontend");
+        fs::create_dir_all(&frontend).unwrap();
+        fs::write(frontend.join("package.json"), "{}").unwrap();
+
+        let script = detect_script_lint(dir.path());
+        assert!(
+            script.contains("cd frontend"),
+            "frontend subdir should be detected for lint"
+        );
+    }
+
+    #[test]
+    fn detect_script_lint_rust_plus_frontend_subdir_both_included() {
+        let dir = tempdir().unwrap();
+        fs::write(
+            dir.path().join("Cargo.toml"),
+            "[package]\nname = \"server\"\n",
+        )
+        .unwrap();
+        let frontend = dir.path().join("frontend");
+        fs::create_dir_all(&frontend).unwrap();
+        fs::write(frontend.join("package.json"), "{}").unwrap();
+
+        let script = detect_script_lint(dir.path());
+        assert!(script.contains("cargo fmt --all --check"));
+        assert!(script.contains("cargo clippy -- -D warnings"));
+        assert!(script.contains("cd frontend"));
+    }
+
+    #[test]
+    fn scaffold_story_kit_creates_script_build_and_lint() {
+        let dir = tempdir().unwrap();
+        scaffold_story_kit(dir.path(), 3001).unwrap();
+
+        assert!(
+            dir.path().join("script/build").exists(),
+            "script/build should be created by scaffold"
+        );
+        assert!(
+            dir.path().join("script/lint").exists(),
+            "script/lint should be created by scaffold"
+        );
+    }
+
+    #[cfg(unix)]
+    #[test]
+    fn scaffold_story_kit_creates_executable_script_build_and_lint() {
+        use std::os::unix::fs::PermissionsExt;
+
+        let dir = tempdir().unwrap();
+        scaffold_story_kit(dir.path(), 3001).unwrap();
+
+        for name in &["build", "lint"] {
+            let path = dir.path().join("script").join(name);
+            assert!(path.exists(), "script/{name} should be created");
+            let perms = fs::metadata(&path).unwrap().permissions();
+            assert!(
+                perms.mode() & 0o111 != 0,
+                "script/{name} should be executable"
+            );
+        }
+    }
+
+    #[test]
+    fn scaffold_script_build_contains_detected_commands_for_rust() {
+        let dir = tempdir().unwrap();
+        fs::write(
+            dir.path().join("Cargo.toml"),
+            "[package]\nname = \"myapp\"\n",
+        )
+        .unwrap();
+
+        scaffold_story_kit(dir.path(), 3001).unwrap();
+
+        let content = fs::read_to_string(dir.path().join("script/build")).unwrap();
+        assert!(
+            content.contains("cargo build --release"),
+            "Rust project scaffold should set cargo build --release in script/build"
+        );
+    }
+
+    #[test]
+    fn scaffold_script_lint_contains_detected_commands_for_rust() {
+        let dir = tempdir().unwrap();
+        fs::write(
+            dir.path().join("Cargo.toml"),
+            "[package]\nname = \"myapp\"\n",
+        )
+        .unwrap();
+
+        scaffold_story_kit(dir.path(), 3001).unwrap();
+
+        let content = fs::read_to_string(dir.path().join("script/lint")).unwrap();
+        assert!(
+            content.contains("cargo fmt --all --check"),
+            "Rust project scaffold should include fmt check in script/lint"
+        );
+        assert!(
+            content.contains("cargo clippy -- -D warnings"),
+            "Rust project scaffold should include clippy in script/lint"
+        );
+    }
+
     // --- generate_project_toml ---
 
     #[test]
@@ -5,7 +5,7 @@ use std::path::Path;
 /// Only untouched templates contain this marker — real project content
 /// will never include it, so it avoids false positives when the project
 /// itself is an "Agentic AI Code Assistant".
-const TEMPLATE_SENTINEL: &str = "<!-- huskies:scaffold-template -->";
+pub(crate) const TEMPLATE_SENTINEL: &str = "<!-- huskies:scaffold-template -->";
 
 /// Marker found in the default `script/test` scaffold output.
 const TEMPLATE_MARKER_SCRIPT: &str = "No tests configured";
@@ -16,9 +16,13 @@ pub enum WizardStep {
     Stack,
     /// Step 4: create script/test
     TestScript,
-    /// Step 5: create script/release
+    /// Step 5: create script/build
+    BuildScript,
+    /// Step 6: create script/lint
+    LintScript,
+    /// Step 7: create script/release
     ReleaseScript,
-    /// Step 6: create script/test_coverage
+    /// Step 8: create script/test_coverage
     TestCoverage,
 }
 
@@ -29,6 +33,8 @@ impl WizardStep {
         WizardStep::Context,
         WizardStep::Stack,
         WizardStep::TestScript,
+        WizardStep::BuildScript,
+        WizardStep::LintScript,
         WizardStep::ReleaseScript,
         WizardStep::TestCoverage,
     ];
@@ -40,6 +46,8 @@ impl WizardStep {
             WizardStep::Context => "Generate project context (00_CONTEXT.md)",
             WizardStep::Stack => "Generate tech stack spec (STACK.md)",
             WizardStep::TestScript => "Create test script (script/test)",
+            WizardStep::BuildScript => "Create build script (script/build)",
+            WizardStep::LintScript => "Create lint script (script/lint)",
             WizardStep::ReleaseScript => "Create release script (script/release)",
             WizardStep::TestCoverage => "Create test coverage script (script/test_coverage)",
         }
@@ -262,7 +270,7 @@ mod tests {
     #[test]
    fn default_state_has_all_steps_pending() {
         let state = WizardState::default();
-        assert_eq!(state.steps.len(), 6);
+        assert_eq!(state.steps.len(), 8);
         for step in &state.steps {
             assert_eq!(step.status, StepStatus::Pending);
         }
@@ -868,6 +868,7 @@ async fn main() -> Result<(), std::io::Error> {
             matrix_shutdown_rx,
             None,
             vec![],
+            std::collections::BTreeMap::new(),
         );
     } else {
         // Keep the receiver alive (drop it) so the sender never errors.