Compare commits
12 commits: v0.3.2...f550018987
| SHA1 |
|---|
| f550018987 |
| 52ec989c3a |
| d080e8b12d |
| cfd85d3a0e |
| 070d53068e |
| fa8e0f39f6 |
| 503fa6b7bf |
| 51a0fb8297 |
| 8ac85a0b67 |
| aa4e042e32 |
| 9352443555 |
| 1faacd7812 |
@@ -60,7 +60,16 @@
     "Edit",
     "Write",
     "Bash(find *)",
-    "Bash(sqlite3 *)"
+    "Bash(sqlite3 *)",
+    "Bash(cat <<:*)",
+    "Bash(cat <<'ENDJSON:*)",
+    "Bash(make release:*)",
+    "Bash(npm test:*)",
+    "Bash(head *)",
+    "Bash(tail *)",
+    "Bash(wc *)",
+    "Bash(npx vite:*)",
+    "Bash(npm run dev:*)"
     ]
   }
 }
@@ -2,6 +2,15 @@
 
 Recurring issues observed during pipeline operation. Review periodically and create stories for systemic problems.
 
+## 2026-03-18: Stories graduating to "done" with empty merges
+
+Pipeline allows stories to move through coding → QA → merge → done without any actual code changes landing on master. The squash-merge produces an empty diff but the pipeline still marks the story as done. Confirmed affected: 247, 273, 280. Stories 274, 278, 279 appeared empty via merge commits but code was actually committed directly to master by agents (see next problem). Root cause: no check that the merge commit contains a non-empty diff before advancing to done. Frequency: 3+ confirmed cases out of 10 done stories.
+
 ## 2026-03-18: Agent committed directly to master instead of worktree
 
-Commit `5f4591f` ("fix: update should_commit_stage test to match 5_done") was made directly on master by an agent (likely mergemaster). Agents should only commit to their feature branch or merge-queue branch, never to master directly. The commit content was correct but the target branch was wrong. Suspect the agent ran `git commit` in the project root instead of the merge worktree directory.
+Multiple agents have committed directly to master instead of their worktree/feature branch:
+
+- Commit `5f4591f` ("fix: update should_commit_stage test to match 5_done") — likely mergemaster
+- Commit `a32cfbd` ("Add bot-level command registry with help command") — story 285 coder committed code + Cargo.lock directly to master
+
+Agents should only commit to their feature branch or merge-queue branch, never to master directly. Suspect agents are running `git commit` in the project root instead of the worktree directory. This can also revert uncommitted fixes on master (e.g. project.toml pkill fix was overwritten). Frequency: at least 2 confirmed cases. This is a recurring and serious problem — needs a guard in the server or agent prompts.
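The empty-merge problem described in the lessons file could be caught with a small pre-advance check. A sketch (the function name and wiring are hypothetical; the real fix would live wherever the pipeline moves stories to done): verify the merge commit has a non-empty diff against its first parent before advancing.

```rust
use std::path::Path;
use std::process::Command;

/// Hypothetical guard: before a story advances to done, verify the merge
/// commit actually changed files relative to its first parent.
/// `git diff --quiet` exits with 0 for an empty diff and 1 when changes exist.
pub fn merge_landed_changes(repo: &Path, merge_sha: &str) -> Result<bool, String> {
    let status = Command::new("git")
        .args(["diff", "--quiet", &format!("{merge_sha}^"), merge_sha])
        .current_dir(repo)
        .status()
        .map_err(|e| format!("failed to spawn git: {e}"))?;
    match status.code() {
        Some(0) => Ok(false), // no diff: the squash-merge was empty
        Some(1) => Ok(true),  // real changes landed
        other => Err(format!("git diff exited with {other:?}")),
    }
}
```

Wired in before the done transition, `Ok(false)` would hold the story in the merge stage instead of silently marking it done.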
@@ -1,21 +0,0 @@
|
|||||||
---
|
|
||||||
name: "Show server logs in web UI"
|
|
||||||
---
|
|
||||||
|
|
||||||
# Story 292: Show server logs in web UI
|
|
||||||
|
|
||||||
## User Story
|
|
||||||
|
|
||||||
As a project owner using the web UI, I want to see live server logs in the interface, so that I can debug agent behavior and pipeline issues without needing terminal access.
|
|
||||||
|
|
||||||
## Acceptance Criteria
|
|
||||||
|
|
||||||
- [ ] Web UI has a server logs panel accessible from the main interface
|
|
||||||
- [ ] Logs stream in real-time via WebSocket or SSE
|
|
||||||
- [ ] Logs can be filtered by keyword (same as get_server_logs MCP tool's filter param)
|
|
||||||
- [ ] Log entries show timestamp and severity level
|
|
||||||
- [ ] Panel doesn't interfere with the existing pipeline board and work item views
|
|
||||||
|
|
||||||
## Out of Scope
|
|
||||||
|
|
||||||
- TBD
|
|
||||||
@@ -0,0 +1,25 @@
+---
+name: "Human QA gate with rejection flow"
+---
+
+# Story 247: Human QA gate with rejection flow
+
+## User Story
+
+As the project owner, I want stories to require my manual approval after machine QA before they can be merged, so that features that compile and pass tests but do not actually work correctly are caught before reaching master.
+
+## Acceptance Criteria
+
+- [ ] Story files support a manual_qa front matter field (defaults to true)
+- [ ] After machine QA passes in 3_qa, stories with manual_qa: true wait for human approval before moving to 4_merge
+- [ ] The UI shows a clear way to launch the app from the worktree for manual testing (single button click), with automatic port conflict handling via .story_kit_port
+- [ ] Frontend and backend are pre-compiled during machine QA so the app is ready to run instantly for manual testing
+- [ ] Only one QA app instance runs at a time — do not automatically spin up multiple instances
+- [ ] Human can approve a story from 3_qa to move it to 4_merge
+- [ ] Human can reject a story from 3_qa back to 2_current with notes about what is broken
+- [ ] Rejection notes are written into the story file so the coder can see what needs fixing
+- [ ] Stories with manual_qa: false skip the human gate and proceed directly from machine QA to 4_merge
+
+## Out of Scope
+
+- TBD
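The `requires_manual_qa` helper that the pipeline changes below call is not shown in this diff. A minimal sketch of what the front-matter check might look like, assuming the same `manual_qa: false` syntax used in the story files and test fixtures, and defaulting to true as the acceptance criteria require:

```rust
use std::path::Path;

/// Sketch of the manual-QA check (assumed implementation, not the repo's):
/// read the story file's front matter and look for `manual_qa: false`.
/// Anything else, including a missing file or missing field, defaults to
/// requiring human review.
pub fn requires_manual_qa(story_path: &Path) -> bool {
    let Ok(contents) = std::fs::read_to_string(story_path) else {
        return true; // unreadable file: fail safe, require review
    };
    let mut lines = contents.lines();
    if lines.next() != Some("---") {
        return true; // no front matter block
    }
    for line in lines {
        if line == "---" {
            break; // end of front matter
        }
        if let Some(value) = line.strip_prefix("manual_qa:") {
            return value.trim() != "false";
        }
    }
    true
}
```

Failing safe toward `true` matches the story's intent: only an explicit opt-out skips the human gate.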
@@ -0,0 +1,21 @@
+---
+name: "Show test results in work item detail panel"
+---
+
+# Story 291: Show test results in work item detail panel
+
+## User Story
+
+As a project owner viewing a work item in the web UI, I want to see the most recent test run results in the expanded detail panel, so that I can quickly see pass/fail status without digging through agent logs.
+
+## Acceptance Criteria
+
+- [ ] Expanded work item detail panel shows the most recent test results for that story
+- [ ] Test results display pass/fail counts for unit and integration tests
+- [ ] Failed tests are listed by name so you can see what broke
+- [ ] Test results are read from the story file's ## Test Results section (already written by record_tests MCP tool)
+- [ ] Panel shows a clear empty state when no test results exist yet
+
+## Out of Scope
+
+- TBD
Cargo.lock (generated): 34 changes
@@ -4026,7 +4026,7 @@ dependencies = [
 "tempfile",
 "tokio",
 "tokio-tungstenite 0.29.0",
-"toml 1.0.6+spec-1.1.0",
+"toml 1.0.7+spec-1.1.0",
 "uuid",
 "wait-timeout",
 "walkdir",

@@ -4367,22 +4367,22 @@ dependencies = [
 "serde_spanned",
 "toml_datetime 0.7.5+spec-1.1.0",
 "toml_parser",
-"winnow",
+"winnow 0.7.14",
 ]
 
 [[package]]
 name = "toml"
-version = "1.0.6+spec-1.1.0"
+version = "1.0.7+spec-1.1.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "399b1124a3c9e16766831c6bba21e50192572cdd98706ea114f9502509686ffc"
+checksum = "dd28d57d8a6f6e458bc0b8784f8fdcc4b99a437936056fa122cb234f18656a96"
 dependencies = [
 "indexmap",
 "serde_core",
 "serde_spanned",
-"toml_datetime 1.0.0+spec-1.1.0",
+"toml_datetime 1.0.1+spec-1.1.0",
 "toml_parser",
 "toml_writer",
-"winnow",
+"winnow 1.0.0",
 ]
 
 [[package]]

@@ -4396,9 +4396,9 @@ dependencies = [
 
 [[package]]
 name = "toml_datetime"
-version = "1.0.0+spec-1.1.0"
+version = "1.0.1+spec-1.1.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "32c2555c699578a4f59f0cc68e5116c8d7cabbd45e1409b989d4be085b53f13e"
+checksum = "9b320e741db58cac564e26c607d3cc1fdc4a88fd36c879568c07856ed83ff3e9"
 dependencies = [
 "serde_core",
 ]

@@ -4412,23 +4412,23 @@ dependencies = [
 "indexmap",
 "toml_datetime 0.7.5+spec-1.1.0",
 "toml_parser",
-"winnow",
+"winnow 0.7.14",
 ]
 
 [[package]]
 name = "toml_parser"
-version = "1.0.9+spec-1.1.0"
+version = "1.0.10+spec-1.1.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "702d4415e08923e7e1ef96cd5727c0dfed80b4d2fa25db9647fe5eb6f7c5a4c4"
+checksum = "7df25b4befd31c4816df190124375d5a20c6b6921e2cad937316de3fccd63420"
 dependencies = [
-"winnow",
+"winnow 1.0.0",
 ]
 
 [[package]]
 name = "toml_writer"
-version = "1.0.6+spec-1.1.0"
+version = "1.0.7+spec-1.1.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "ab16f14aed21ee8bfd8ec22513f7287cd4a91aa92e44edfe2c17ddd004e92607"
+checksum = "f17aaa1c6e3dc22b1da4b6bba97d066e354c7945cac2f7852d4e4e7ca7a6b56d"
 
 [[package]]
 name = "tower"

@@ -5444,6 +5444,12 @@ dependencies = [
 "memchr",
 ]
 
+[[package]]
+name = "winnow"
+version = "1.0.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "a90e88e4667264a994d34e6d1ab2d26d398dcdca8b7f52bec8668957517fc7d8"
+
 [[package]]
 name = "winreg"
 version = "0.10.1"
@@ -24,7 +24,7 @@ serde_yaml = "0.9"
 strip-ansi-escapes = "0.2"
 tempfile = "3"
 tokio = { version = "1", features = ["rt-multi-thread", "macros", "sync"] }
-toml = "1.0.6"
+toml = "1.0.7"
 uuid = { version = "1.22.0", features = ["v4", "serde"] }
 tokio-tungstenite = "0.29.0"
 walkdir = "2.5.0"
@@ -128,8 +128,7 @@ export function subscribeAgentStream(
   onEvent: (event: AgentEvent) => void,
   onError?: (error: Event) => void,
 ): () => void {
-  const host = import.meta.env.DEV ? "http://127.0.0.1:3001" : "";
-  const url = `${host}/agents/${encodeURIComponent(storyId)}/${encodeURIComponent(agentName)}/stream`;
+  const url = `/agents/${encodeURIComponent(storyId)}/${encodeURIComponent(agentName)}/stream`;
 
   const eventSource = new EventSource(url);
 
@@ -33,6 +33,8 @@ export interface PipelineStageItem {
   error: string | null;
   merge_failure: string | null;
   agent: AgentAssignment | null;
+  review_hold: boolean | null;
+  manual_qa: boolean | null;
 }
 
 export interface PipelineState {

@@ -312,8 +314,42 @@ export const api = {
       baseUrl,
     );
   },
+
+  /** Approve a story in QA, moving it to merge. */
+  approveQa(storyId: string) {
+    return callMcpTool("approve_qa", { story_id: storyId });
+  },
+
+  /** Reject a story in QA, moving it back to current with notes. */
+  rejectQa(storyId: string, notes: string) {
+    return callMcpTool("reject_qa", { story_id: storyId, notes });
+  },
+
+  /** Launch the QA app for a story's worktree. */
+  launchQaApp(storyId: string) {
+    return callMcpTool("launch_qa_app", { story_id: storyId });
+  },
 };
 
+async function callMcpTool(
+  toolName: string,
+  args: Record<string, unknown>,
+): Promise<string> {
+  const res = await fetch("/mcp", {
+    method: "POST",
+    headers: { "Content-Type": "application/json" },
+    body: JSON.stringify({
+      jsonrpc: "2.0",
+      id: 1,
+      method: "tools/call",
+      params: { name: toolName, arguments: args },
+    }),
+  });
+  const json = await res.json();
+  if (json.error) {
+    throw new Error(json.error.message);
+  }
+  const text = json.result?.content?.[0]?.text ?? "";
+  return text;
+}
+
 export class ChatWebSocket {
   private static sharedSocket: WebSocket | null = null;
   private static refCount = 0;
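For reference, `callMcpTool` wraps every tool invocation in a standard JSON-RPC 2.0 envelope. A `reject_qa` call, for example, would POST a body like the following to `/mcp` (the notes value here is illustrative, not from the diff):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "reject_qa",
    "arguments": {
      "story_id": "247_story_human_qa_gate",
      "notes": "Launch button does nothing when the port is already taken"
    }
  }
}
```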
@@ -27,6 +27,8 @@ interface WorkItemDetailPanelProps {
   storyId: string;
   pipelineVersion: number;
   onClose: () => void;
+  /** True when the item is in QA and awaiting human review. */
+  reviewHold?: boolean;
 }
 
 function TestCaseRow({ tc }: { tc: TestCaseResult }) {

@@ -109,6 +111,7 @@ export function WorkItemDetailPanel({
   storyId,
   pipelineVersion,
   onClose,
+  reviewHold: _reviewHold,
 }: WorkItemDetailPanelProps) {
   const [content, setContent] = useState<string | null>(null);
   const [stage, setStage] = useState<string>("");
@@ -23,6 +23,13 @@ export default defineConfig(() => {
         });
       },
     },
+    "/agents": {
+      target: `http://127.0.0.1:${String(backendPort)}`,
+      timeout: 120000,
+      configure: (proxy) => {
+        proxy.on("error", (_err) => {});
+      },
+    },
   },
   watch: {
     ignored: [
@@ -1,7 +1,7 @@
 use std::path::{Path, PathBuf};
 use std::process::Command;
 
-use crate::io::story_metadata::clear_front_matter_field;
+use crate::io::story_metadata::{clear_front_matter_field, write_rejection_notes};
 use crate::slog;
 
 pub(super) fn item_type_from_id(item_id: &str) -> &'static str {

@@ -219,6 +219,52 @@ pub fn move_story_to_qa(project_root: &Path, story_id: &str) -> Result<(), String> {
     Ok(())
 }
 
+/// Move a story from `work/3_qa/` back to `work/2_current/` and write rejection notes.
+///
+/// Used when a human reviewer rejects a story during manual QA.
+/// Clears the `review_hold` front matter field and appends rejection notes to the story file.
+pub fn reject_story_from_qa(
+    project_root: &Path,
+    story_id: &str,
+    notes: &str,
+) -> Result<(), String> {
+    let sk = project_root.join(".story_kit").join("work");
+    let qa_path = sk.join("3_qa").join(format!("{story_id}.md"));
+    let current_dir = sk.join("2_current");
+    let current_path = current_dir.join(format!("{story_id}.md"));
+
+    if current_path.exists() {
+        return Ok(()); // Already in 2_current — idempotent.
+    }
+
+    if !qa_path.exists() {
+        return Err(format!(
+            "Work item '{story_id}' not found in work/3_qa/. Cannot reject."
+        ));
+    }
+
+    std::fs::create_dir_all(&current_dir)
+        .map_err(|e| format!("Failed to create work/2_current/ directory: {e}"))?;
+    std::fs::rename(&qa_path, &current_path)
+        .map_err(|e| format!("Failed to move '{story_id}' from 3_qa/ to 2_current/: {e}"))?;
+
+    // Clear review_hold since the story is going back for rework.
+    if let Err(e) = clear_front_matter_field(&current_path, "review_hold") {
+        slog!("[lifecycle] Warning: could not clear review_hold from '{story_id}': {e}");
+    }
+
+    // Write rejection notes into the story file so the coder can see what needs fixing.
+    if !notes.is_empty()
+        && let Err(e) = write_rejection_notes(&current_path, notes)
+    {
+        slog!("[lifecycle] Warning: could not write rejection notes to '{story_id}': {e}");
+    }
+
+    slog!("[lifecycle] Rejected '{story_id}' from work/3_qa/ back to work/2_current/");
+
+    Ok(())
+}
+
 /// Move a bug from `work/2_current/` or `work/1_backlog/` to `work/5_done/` and auto-commit.
 ///
 /// * If the bug is in `2_current/`, it is moved to `5_done/` and committed.

@@ -552,4 +598,51 @@ mod tests {
             "should return false when no feature branch"
         );
     }
+
+    // ── reject_story_from_qa tests ────────────────────────────────────────────
+
+    #[test]
+    fn reject_story_from_qa_moves_to_current() {
+        use std::fs;
+        let tmp = tempfile::tempdir().unwrap();
+        let root = tmp.path();
+        let qa_dir = root.join(".story_kit/work/3_qa");
+        let current_dir = root.join(".story_kit/work/2_current");
+        fs::create_dir_all(&qa_dir).unwrap();
+        fs::create_dir_all(&current_dir).unwrap();
+        fs::write(
+            qa_dir.join("50_story_test.md"),
+            "---\nname: Test\nreview_hold: true\n---\n# Story\n",
+        )
+        .unwrap();
+
+        reject_story_from_qa(root, "50_story_test", "Button color wrong").unwrap();
+
+        assert!(!qa_dir.join("50_story_test.md").exists());
+        assert!(current_dir.join("50_story_test.md").exists());
+        let contents = fs::read_to_string(current_dir.join("50_story_test.md")).unwrap();
+        assert!(contents.contains("Button color wrong"));
+        assert!(contents.contains("## QA Rejection Notes"));
+        assert!(!contents.contains("review_hold"));
+    }
+
+    #[test]
+    fn reject_story_from_qa_errors_when_not_in_qa() {
+        let tmp = tempfile::tempdir().unwrap();
+        let result = reject_story_from_qa(tmp.path(), "99_nonexistent", "notes");
+        assert!(result.unwrap_err().contains("not found in work/3_qa/"));
+    }
+
+    #[test]
+    fn reject_story_from_qa_idempotent_when_in_current() {
+        use std::fs;
+        let tmp = tempfile::tempdir().unwrap();
+        let root = tmp.path();
+        let current_dir = root.join(".story_kit/work/2_current");
+        fs::create_dir_all(&current_dir).unwrap();
+        fs::write(current_dir.join("51_story_test.md"), "---\nname: Test\n---\n# Story\n").unwrap();
+
+        reject_story_from_qa(root, "51_story_test", "notes").unwrap();
+        assert!(current_dir.join("51_story_test.md").exists());
+    }
 }
@@ -9,7 +9,7 @@ use serde::Serialize;
 
 pub use lifecycle::{
     close_bug_to_archive, feature_branch_has_unmerged_changes, move_story_to_archived,
-    move_story_to_merge, move_story_to_qa,
+    move_story_to_merge, move_story_to_qa, reject_story_from_qa,
 };
 pub use pool::AgentPool;
 
@@ -889,25 +889,37 @@ impl AgentPool {
         };
 
         if coverage_passed {
-            // Spikes skip merge — they stay in 3_qa/ for human review.
-            if super::lifecycle::item_type_from_id(story_id) == "spike" {
-                // Mark the spike as held for review so auto-assign won't
-                // restart QA on it.
-                let qa_dir = project_root.join(".story_kit/work/3_qa");
-                let spike_path = qa_dir.join(format!("{story_id}.md"));
-                if let Err(e) = crate::io::story_metadata::write_review_hold(&spike_path) {
+            // Check whether this item needs human review before merging.
+            let needs_human_review = {
+                let item_type = super::lifecycle::item_type_from_id(story_id);
+                if item_type == "spike" {
+                    true // Spikes always need human review.
+                } else {
+                    // Stories/bugs: check the manual_qa front matter field (defaults to true).
+                    let qa_dir = project_root.join(".story_kit/work/3_qa");
+                    let story_path = qa_dir.join(format!("{story_id}.md"));
+                    crate::io::story_metadata::requires_manual_qa(&story_path)
+                }
+            };
+
+            if needs_human_review {
+                // Hold in 3_qa/ for human review.
+                let qa_dir = project_root.join(".story_kit/work/3_qa");
+                let story_path = qa_dir.join(format!("{story_id}.md"));
+                if let Err(e) = crate::io::story_metadata::write_review_hold(&story_path) {
                     slog_error!("[pipeline] Failed to set review_hold on '{story_id}': {e}");
                 }
                 slog!(
-                    "[pipeline] QA passed for spike '{story_id}'. \
-                     Stopping for human review (skipping merge). \
+                    "[pipeline] QA passed for '{story_id}'. \
+                     Holding for human review. \
                      Worktree preserved at: {worktree_path:?}"
                 );
-                // Free up the QA slot without advancing the spike.
+                // Free up the QA slot without advancing.
                 self.auto_assign_available_work(&project_root).await;
             } else {
                 slog!(
-                    "[pipeline] QA passed gates and coverage for '{story_id}'. Moving to merge."
+                    "[pipeline] QA passed gates and coverage for '{story_id}'. \
+                     manual_qa: false — moving directly to merge."
                 );
                 if let Err(e) = super::lifecycle::move_story_to_merge(&project_root, story_id) {
                     slog_error!("[pipeline] Failed to move '{story_id}' to 4_merge/: {e}");
@@ -1746,23 +1758,35 @@ impl AgentPool {
         };
 
         if coverage_passed {
-            // Spikes skip the merge stage — stay in 3_qa/ for human review.
-            if super::lifecycle::item_type_from_id(story_id) == "spike" {
-                let spike_path = project_root
-                    .join(".story_kit/work/3_qa")
-                    .join(format!("{story_id}.md"));
-                if let Err(e) = crate::io::story_metadata::write_review_hold(&spike_path) {
+            // Check whether this item needs human review before merging.
+            let needs_human_review = {
+                let item_type = super::lifecycle::item_type_from_id(story_id);
+                if item_type == "spike" {
+                    true
+                } else {
+                    let story_path = project_root
+                        .join(".story_kit/work/3_qa")
+                        .join(format!("{story_id}.md"));
+                    crate::io::story_metadata::requires_manual_qa(&story_path)
+                }
+            };
+
+            if needs_human_review {
+                let story_path = project_root
+                    .join(".story_kit/work/3_qa")
+                    .join(format!("{story_id}.md"));
+                if let Err(e) = crate::io::story_metadata::write_review_hold(&story_path) {
                     eprintln!(
-                        "[startup:reconcile] Failed to set review_hold on spike '{story_id}': {e}"
+                        "[startup:reconcile] Failed to set review_hold on '{story_id}': {e}"
                     );
                 }
                 eprintln!(
-                    "[startup:reconcile] Spike '{story_id}' passed QA — holding for human review."
+                    "[startup:reconcile] '{story_id}' passed QA — holding for human review."
                 );
                 let _ = progress_tx.send(ReconciliationEvent {
                     story_id: story_id.clone(),
                     status: "review_hold".to_string(),
-                    message: "Spike passed QA — waiting for human review.".to_string(),
+                    message: "Passed QA — waiting for human review.".to_string(),
                 });
             } else if let Err(e) = super::lifecycle::move_story_to_merge(project_root, story_id) {
                 eprintln!(
@@ -2655,7 +2679,12 @@ mod tests {
         // Set up story in 3_qa/
         let qa_dir = root.join(".story_kit/work/3_qa");
         fs::create_dir_all(&qa_dir).unwrap();
-        fs::write(qa_dir.join("51_story_test.md"), "test").unwrap();
+        // manual_qa: false so the story skips human review and goes straight to merge.
+        fs::write(
+            qa_dir.join("51_story_test.md"),
+            "---\nname: Test\nmanual_qa: false\n---\ntest",
+        )
+        .unwrap();
 
         let pool = AgentPool::new_test(3001);
         pool.run_pipeline_advance(
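Both pipeline call sites above compute the same gate. Factored out, the decision reduces to a small pure function (a sketch for clarity; the real code reads the flag from the story file via `requires_manual_qa` rather than taking it as a parameter):

```rust
/// Sketch of the review gate shared by the pipeline-advance and
/// startup-reconcile paths: spikes always hold for human review, while
/// stories and bugs hold unless the front matter explicitly sets
/// `manual_qa: false`.
fn needs_human_review(item_type: &str, manual_qa: Option<bool>) -> bool {
    match (item_type, manual_qa) {
        ("spike", _) => true,      // spikes never auto-merge
        (_, Some(false)) => false, // explicit opt-out skips the gate
        _ => true,                 // missing field defaults to true
    }
}
```

Only the explicit `manual_qa: false` path reaches `move_story_to_merge` without a human in the loop.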
@@ -49,6 +49,9 @@ pub struct AppContext {
     /// Receiver for permission requests. The active WebSocket handler locks
     /// this and polls for incoming permission forwards.
     pub perm_rx: Arc<tokio::sync::Mutex<mpsc::UnboundedReceiver<PermissionForward>>>,
+    /// Child process of the QA app launched for manual testing.
+    /// Only one instance runs at a time.
+    pub qa_app_process: Arc<std::sync::Mutex<Option<std::process::Child>>>,
 }
 
 #[cfg(test)]

@@ -69,6 +72,7 @@ impl AppContext {
             reconciliation_tx,
             perm_tx,
             perm_rx: Arc::new(tokio::sync::Mutex::new(perm_rx)),
+            qa_app_process: Arc::new(std::sync::Mutex::new(None)),
         }
     }
 }
@@ -1,4 +1,4 @@
|
|||||||
use crate::agents::{close_bug_to_archive, feature_branch_has_unmerged_changes, move_story_to_archived, move_story_to_merge, move_story_to_qa, AgentStatus, PipelineStage};
|
use crate::agents::{close_bug_to_archive, feature_branch_has_unmerged_changes, move_story_to_archived, move_story_to_merge, move_story_to_qa, reject_story_from_qa, AgentStatus, PipelineStage};
|
||||||
use crate::config::ProjectConfig;
|
use crate::config::ProjectConfig;
|
||||||
use crate::log_buffer;
|
use crate::log_buffer;
|
||||||
use crate::slog;
|
use crate::slog;
|
||||||
@@ -862,6 +862,52 @@ fn handle_tools_list(id: Option<Value>) -> JsonRpcResponse {
|
|||||||
"required": ["story_id"]
|
"required": ["story_id"]
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
{
|
||||||
|
"name": "approve_qa",
|
||||||
|
"description": "Approve a story that passed machine QA and is awaiting human review. Moves the story from work/3_qa/ to work/4_merge/ and starts the mergemaster agent.",
|
||||||
|
"inputSchema": {
|
||||||
|
"type": "object",
|
||||||
|
"properties": {
|
||||||
|
"story_id": {
|
||||||
|
"type": "string",
|
||||||
|
"description": "Story identifier (e.g. '247_story_human_qa_gate')"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"required": ["story_id"]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name": "reject_qa",
|
||||||
|
"description": "Reject a story during human QA review. Moves the story from work/3_qa/ back to work/2_current/ with rejection notes so the coder agent can fix the issues.",
|
||||||
|
"inputSchema": {
|
||||||
|
"type": "object",
|
||||||
|
"properties": {
|
||||||
|
"story_id": {
|
||||||
|
"type": "string",
|
||||||
|
"description": "Story identifier (e.g. '247_story_human_qa_gate')"
|
||||||
|
},
|
||||||
|
"notes": {
|
||||||
|
"type": "string",
|
||||||
|
"description": "Explanation of what is broken or needs fixing"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"required": ["story_id", "notes"]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name": "launch_qa_app",
|
||||||
|
"description": "Launch the app from a story's worktree for manual QA testing. Automatically assigns a free port, writes it to .story_kit_port, and starts the backend server. Only one QA app instance runs at a time.",
|
||||||
|
"inputSchema": {
|
||||||
|
"type": "object",
|
||||||
|
"properties": {
|
||||||
|
"story_id": {
|
||||||
|
"type": "string",
|
||||||
|
"description": "Story identifier whose worktree app to launch"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"required": ["story_id"]
|
||||||
|
}
|
||||||
|
},
|
||||||
{
|
{
|
||||||
"name": "get_pipeline_status",
|
"name": "get_pipeline_status",
|
||||||
"description": "Return a structured snapshot of the full work item pipeline. Includes all active stages (current, qa, merge, done) with each item's stage, name, and assigned agent. Also includes upcoming backlog items.",
|
"description": "Return a structured snapshot of the full work item pipeline. Includes all active stages (current, qa, merge, done) with each item's stage, name, and assigned agent. Also includes upcoming backlog items.",
|
||||||
@@ -979,6 +1025,9 @@ async fn handle_tools_call(
         "report_merge_failure" => tool_report_merge_failure(&args, ctx),
         // QA tools
         "request_qa" => tool_request_qa(&args, ctx).await,
+        "approve_qa" => tool_approve_qa(&args, ctx).await,
+        "reject_qa" => tool_reject_qa(&args, ctx).await,
+        "launch_qa_app" => tool_launch_qa_app(&args, ctx).await,
         // Pipeline status
         "get_pipeline_status" => tool_get_pipeline_status(ctx),
         // Diagnostics
@@ -1947,6 +1996,159 @@ async fn tool_request_qa(args: &Value, ctx: &AppContext) -> Result<String, Strin
     .map_err(|e| format!("Serialization error: {e}"))
 }
 
+async fn tool_approve_qa(args: &Value, ctx: &AppContext) -> Result<String, String> {
+    let story_id = args
+        .get("story_id")
+        .and_then(|v| v.as_str())
+        .ok_or("Missing required argument: story_id")?;
+
+    let project_root = ctx.agents.get_project_root(&ctx.state)?;
+
+    // Clear review_hold before moving
+    let qa_path = project_root
+        .join(".story_kit/work/3_qa")
+        .join(format!("{story_id}.md"));
+    if qa_path.exists() {
+        let _ = crate::io::story_metadata::clear_front_matter_field(&qa_path, "review_hold");
+    }
+
+    // Move story from work/3_qa/ to work/4_merge/
+    move_story_to_merge(&project_root, story_id)?;
+
+    // Start the mergemaster agent
+    let info = ctx
+        .agents
+        .start_agent(&project_root, story_id, Some("mergemaster"), None)
+        .await?;
+
+    serde_json::to_string_pretty(&json!({
+        "story_id": info.story_id,
+        "agent_name": info.agent_name,
+        "status": info.status.to_string(),
+        "message": format!(
+            "Story '{story_id}' approved. Moved to work/4_merge/ and mergemaster agent '{}' started.",
+            info.agent_name
+        ),
+    }))
+    .map_err(|e| format!("Serialization error: {e}"))
+}
+
+async fn tool_reject_qa(args: &Value, ctx: &AppContext) -> Result<String, String> {
+    let story_id = args
+        .get("story_id")
+        .and_then(|v| v.as_str())
+        .ok_or("Missing required argument: story_id")?;
+    let notes = args
+        .get("notes")
+        .and_then(|v| v.as_str())
+        .ok_or("Missing required argument: notes")?;
+
+    let project_root = ctx.agents.get_project_root(&ctx.state)?;
+
+    // Move story from work/3_qa/ back to work/2_current/ with rejection notes
+    reject_story_from_qa(&project_root, story_id, notes)?;
+
+    // Restart the coder agent with rejection context
+    let story_path = project_root
+        .join(".story_kit/work/2_current")
+        .join(format!("{story_id}.md"));
+    let agent_name = if story_path.exists() {
+        let contents = std::fs::read_to_string(&story_path).unwrap_or_default();
+        crate::io::story_metadata::parse_front_matter(&contents)
+            .ok()
+            .and_then(|meta| meta.agent)
+    } else {
+        None
+    };
+    let agent_name = agent_name.as_deref().unwrap_or("coder-opus");
+
+    let context = format!(
+        "\n\n---\n## QA Rejection\n\
+        Your previous implementation was rejected during human QA review.\n\
+        Rejection notes:\n{notes}\n\n\
+        Please fix the issues described above and try again."
+    );
+    if let Err(e) = ctx
+        .agents
+        .start_agent(&project_root, story_id, Some(agent_name), Some(&context))
+        .await
+    {
+        slog_warn!("[qa] Failed to restart coder for '{story_id}' after rejection: {e}");
+    }
+
+    Ok(format!(
+        "Story '{story_id}' rejected and moved back to work/2_current/. Coder agent '{agent_name}' restarted with rejection notes."
+    ))
+}
+
+async fn tool_launch_qa_app(args: &Value, ctx: &AppContext) -> Result<String, String> {
+    let story_id = args
+        .get("story_id")
+        .and_then(|v| v.as_str())
+        .ok_or("Missing required argument: story_id")?;
+
+    let project_root = ctx.agents.get_project_root(&ctx.state)?;
+
+    // Find the worktree path for this story
+    let worktrees = crate::worktree::list_worktrees(&project_root)?;
+    let wt = worktrees
+        .iter()
+        .find(|w| w.story_id == story_id)
+        .ok_or_else(|| format!("No worktree found for story '{story_id}'"))?;
+    let wt_path = wt.path.clone();
+
+    // Stop any existing QA app instance
+    {
+        let mut guard = ctx.qa_app_process.lock().unwrap();
+        if let Some(mut child) = guard.take() {
+            let _ = child.kill();
+            let _ = child.wait();
+            slog!("[qa-app] Stopped previous QA app instance.");
+        }
+    }
+
+    // Find a free port starting from 3100
+    let port = find_free_port(3100);
+
+    // Write .story_kit_port so the frontend dev server knows where to connect
+    let port_file = wt_path.join(".story_kit_port");
+    std::fs::write(&port_file, port.to_string())
+        .map_err(|e| format!("Failed to write .story_kit_port: {e}"))?;
+
+    // Launch the server from the worktree
+    let child = std::process::Command::new("cargo")
+        .args(["run"])
+        .env("STORYKIT_PORT", port.to_string())
+        .current_dir(&wt_path)
+        .stdout(std::process::Stdio::null())
+        .stderr(std::process::Stdio::null())
+        .spawn()
+        .map_err(|e| format!("Failed to launch QA app: {e}"))?;
+
+    {
+        let mut guard = ctx.qa_app_process.lock().unwrap();
+        *guard = Some(child);
+    }
+
+    serde_json::to_string_pretty(&json!({
+        "story_id": story_id,
+        "port": port,
+        "worktree_path": wt_path.to_string_lossy(),
+        "message": format!("QA app launched on port {port} from worktree at {}", wt_path.display()),
+    }))
+    .map_err(|e| format!("Serialization error: {e}"))
+}
+
+/// Find a free TCP port starting from `start`.
+fn find_free_port(start: u16) -> u16 {
+    for port in start..start + 100 {
+        if std::net::TcpListener::bind(("127.0.0.1", port)).is_ok() {
+            return port;
+        }
+    }
+    start // fallback
+}
+
 /// Run `git log <base>..HEAD --oneline` in the worktree and return the commit
 /// summaries, or `None` if git is unavailable or there are no new commits.
 async fn get_worktree_commits(worktree_path: &str, base_branch: &str) -> Option<Vec<String>> {
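One subtlety in the new `find_free_port` helper: when every port in the `start..start + 100` window is occupied it falls back to returning `start` anyway, so the launched app's own bind can still fail. A minimal standalone sketch of the same scan, using only the standard library (the demo ports are arbitrary choices, not values from the patch beyond the 3100 default):

```rust
use std::net::TcpListener;

/// Scan `start..start+100` and return the first port that binds on
/// localhost. Falls back to `start` when the whole window is busy, so the
/// caller always gets a deterministic (if possibly occupied) port.
fn find_free_port(start: u16) -> u16 {
    for port in start..start + 100 {
        if TcpListener::bind(("127.0.0.1", port)).is_ok() {
            return port;
        }
    }
    start // fallback: may still be in use
}

fn main() {
    // Hold a listener on the start port so the scan must skip past it.
    let blocker = TcpListener::bind(("127.0.0.1", 3100));
    let port = find_free_port(3100);
    assert!((3100..3200).contains(&port));
    if blocker.is_ok() {
        // We hold 3100 ourselves, so the scan must have moved on.
        assert_ne!(port, 3100);
    }
    println!("picked port {port}");
}
```

Binding only to probe (and immediately dropping the listener) leaves a small race window before the real server binds, which is why keeping the fallback cheap and deterministic is a reasonable trade-off here.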
@@ -2383,11 +2585,14 @@ mod tests {
         assert!(names.contains(&"move_story_to_merge"));
         assert!(names.contains(&"report_merge_failure"));
         assert!(names.contains(&"request_qa"));
+        assert!(names.contains(&"approve_qa"));
+        assert!(names.contains(&"reject_qa"));
+        assert!(names.contains(&"launch_qa_app"));
         assert!(names.contains(&"get_server_logs"));
         assert!(names.contains(&"prompt_permission"));
         assert!(names.contains(&"get_pipeline_status"));
         assert!(names.contains(&"rebuild_and_restart"));
-        assert_eq!(tools.len(), 36);
+        assert_eq!(tools.len(), 39);
     }
 
     #[test]
@@ -3934,6 +4139,80 @@ stage = "coder"
         assert!(!req_names.contains(&"agent_name"));
     }
 
+    // ── approve_qa in tools list ──────────────────────────────────
+
+    #[test]
+    fn approve_qa_in_tools_list() {
+        let resp = handle_tools_list(Some(json!(1)));
+        let tools = resp.result.unwrap()["tools"].as_array().unwrap().clone();
+        let tool = tools.iter().find(|t| t["name"] == "approve_qa");
+        assert!(tool.is_some(), "approve_qa missing from tools list");
+        let t = tool.unwrap();
+        let required = t["inputSchema"]["required"].as_array().unwrap();
+        let req_names: Vec<&str> = required.iter().map(|v| v.as_str().unwrap()).collect();
+        assert!(req_names.contains(&"story_id"));
+    }
+
+    // ── reject_qa in tools list ──────────────────────────────────
+
+    #[test]
+    fn reject_qa_in_tools_list() {
+        let resp = handle_tools_list(Some(json!(1)));
+        let tools = resp.result.unwrap()["tools"].as_array().unwrap().clone();
+        let tool = tools.iter().find(|t| t["name"] == "reject_qa");
+        assert!(tool.is_some(), "reject_qa missing from tools list");
+        let t = tool.unwrap();
+        let required = t["inputSchema"]["required"].as_array().unwrap();
+        let req_names: Vec<&str> = required.iter().map(|v| v.as_str().unwrap()).collect();
+        assert!(req_names.contains(&"story_id"));
+        assert!(req_names.contains(&"notes"));
+    }
+
+    // ── launch_qa_app in tools list ──────────────────────────────
+
+    #[test]
+    fn launch_qa_app_in_tools_list() {
+        let resp = handle_tools_list(Some(json!(1)));
+        let tools = resp.result.unwrap()["tools"].as_array().unwrap().clone();
+        let tool = tools.iter().find(|t| t["name"] == "launch_qa_app");
+        assert!(tool.is_some(), "launch_qa_app missing from tools list");
+        let t = tool.unwrap();
+        let required = t["inputSchema"]["required"].as_array().unwrap();
+        let req_names: Vec<&str> = required.iter().map(|v| v.as_str().unwrap()).collect();
+        assert!(req_names.contains(&"story_id"));
+    }
+
+    // ── approve_qa missing story_id ──────────────────────────────
+
+    #[tokio::test]
+    async fn tool_approve_qa_missing_story_id() {
+        let tmp = tempfile::tempdir().unwrap();
+        let ctx = test_ctx(tmp.path());
+        let result = tool_approve_qa(&json!({}), &ctx).await;
+        assert!(result.is_err());
+        assert!(result.unwrap_err().contains("story_id"));
+    }
+
+    // ── reject_qa missing arguments ──────────────────────────────
+
+    #[tokio::test]
+    async fn tool_reject_qa_missing_story_id() {
+        let tmp = tempfile::tempdir().unwrap();
+        let ctx = test_ctx(tmp.path());
+        let result = tool_reject_qa(&json!({"notes": "broken"}), &ctx).await;
+        assert!(result.is_err());
+        assert!(result.unwrap_err().contains("story_id"));
+    }
+
+    #[tokio::test]
+    async fn tool_reject_qa_missing_notes() {
+        let tmp = tempfile::tempdir().unwrap();
+        let ctx = test_ctx(tmp.path());
+        let result = tool_reject_qa(&json!({"story_id": "1_story_test"}), &ctx).await;
+        assert!(result.is_err());
+        assert!(result.unwrap_err().contains("notes"));
+    }
+
     // ── tool_validate_stories with file content ───────────────────
 
     #[test]
@@ -24,6 +24,12 @@ pub struct UpcomingStory {
     pub merge_failure: Option<String>,
     /// Active agent working on this item, if any.
     pub agent: Option<AgentAssignment>,
+    /// True when the item is held in QA for human review.
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub review_hold: Option<bool>,
+    /// Whether the item requires manual QA (defaults to true when absent).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub manual_qa: Option<bool>,
 }
 
 pub struct StoryValidationResult {
@@ -117,12 +123,12 @@ fn load_stage_items(
             .to_string();
         let contents = fs::read_to_string(&path)
             .map_err(|e| format!("Failed to read story file {}: {e}", path.display()))?;
-        let (name, error, merge_failure) = match parse_front_matter(&contents) {
-            Ok(meta) => (meta.name, None, meta.merge_failure),
-            Err(e) => (None, Some(e.to_string()), None),
+        let (name, error, merge_failure, review_hold, manual_qa) = match parse_front_matter(&contents) {
+            Ok(meta) => (meta.name, None, meta.merge_failure, meta.review_hold, meta.manual_qa),
+            Err(e) => (None, Some(e.to_string()), None, None, None),
         };
         let agent = agent_map.get(&story_id).cloned();
-        stories.push(UpcomingStory { story_id, name, error, merge_failure, agent });
+        stories.push(UpcomingStory { story_id, name, error, merge_failure, agent, review_hold, manual_qa });
     }
 
     stories.sort_by(|a, b| a.story_id.cmp(&b.story_id));
@@ -693,6 +693,8 @@ mod tests {
             error: None,
             merge_failure: None,
             agent: None,
+            review_hold: None,
+            manual_qa: None,
         };
         let resp = WsResponse::PipelineState {
             backlog: vec![story],
@@ -830,6 +832,8 @@ mod tests {
                 error: None,
                 merge_failure: None,
                 agent: None,
+                review_hold: None,
+                manual_qa: None,
             }],
             current: vec![UpcomingStory {
                 story_id: "2_story_b".to_string(),
@@ -837,6 +841,8 @@ mod tests {
                 error: None,
                 merge_failure: None,
                 agent: None,
+                review_hold: None,
+                manual_qa: None,
            }],
            qa: vec![],
            merge: vec![],
@@ -846,6 +852,8 @@ mod tests {
                 error: None,
                 merge_failure: None,
                 agent: None,
+                review_hold: None,
+                manual_qa: None,
            }],
        };
        let resp: WsResponse = state.into();
@@ -1002,6 +1010,8 @@ mod tests {
                    model: Some("claude-3-5-sonnet".to_string()),
                    status: "running".to_string(),
                }),
+                review_hold: None,
+                manual_qa: None,
            }],
            qa: vec![],
            merge: vec![],
@@ -9,6 +9,7 @@ pub struct StoryMetadata {
     pub merge_failure: Option<String>,
     pub agent: Option<String>,
     pub review_hold: Option<bool>,
+    pub manual_qa: Option<bool>,
 }
 
 #[derive(Debug, Clone, PartialEq, Eq)]
@@ -33,6 +34,7 @@ struct FrontMatter {
     merge_failure: Option<String>,
     agent: Option<String>,
     review_hold: Option<bool>,
+    manual_qa: Option<bool>,
 }
 
 pub fn parse_front_matter(contents: &str) -> Result<StoryMetadata, StoryMetaError> {
@@ -67,6 +69,7 @@ fn build_metadata(front: FrontMatter) -> StoryMetadata {
         merge_failure: front.merge_failure,
         agent: front.agent,
         review_hold: front.review_hold,
+        manual_qa: front.manual_qa,
     }
 }
 
@@ -193,6 +196,32 @@ pub fn set_front_matter_field(contents: &str, key: &str, value: &str) -> String
     result
 }
 
+/// Append rejection notes to a story file body.
+///
+/// Adds a `## QA Rejection Notes` section at the end of the file so the coder
+/// agent can see what needs fixing.
+pub fn write_rejection_notes(path: &Path, notes: &str) -> Result<(), String> {
+    let contents =
+        fs::read_to_string(path).map_err(|e| format!("Failed to read story file: {e}"))?;
+
+    let section = format!("\n\n## QA Rejection Notes\n\n{notes}\n");
+    let updated = format!("{contents}{section}");
+    fs::write(path, &updated).map_err(|e| format!("Failed to write story file: {e}"))?;
+    Ok(())
+}
+
+/// Check whether a story requires manual QA (defaults to true).
+pub fn requires_manual_qa(path: &Path) -> bool {
+    let contents = match fs::read_to_string(path) {
+        Ok(c) => c,
+        Err(_) => return true,
+    };
+    match parse_front_matter(&contents) {
+        Ok(meta) => meta.manual_qa.unwrap_or(true),
+        Err(_) => true,
+    }
+}
+
 pub fn parse_unchecked_todos(contents: &str) -> Vec<String> {
     contents
         .lines()
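The `requires_manual_qa` helper fails closed: a missing file, unparsable front matter, and an absent key all count as "manual QA required". A standalone sketch of that default-to-true decision over a raw front-matter string — the naive line scan here is purely illustrative; the patch itself goes through `parse_front_matter`, not this function:

```rust
/// Return the story's `manual_qa` flag, defaulting to true when the key is
/// absent or the front matter is malformed. Failing closed means a story
/// is never silently skipped by human QA.
fn manual_qa_flag(contents: &str) -> bool {
    let mut lines = contents.lines();
    if lines.next() != Some("---") {
        return true; // no front matter at all
    }
    for line in lines {
        if line == "---" {
            break; // end of front matter; key never seen
        }
        if let Some(value) = line.strip_prefix("manual_qa:") {
            return value.trim() != "false";
        }
    }
    true
}

fn main() {
    assert!(!manual_qa_flag("---\nname: Story\nmanual_qa: false\n---\n# S\n"));
    assert!(manual_qa_flag("---\nname: Story\n---\n# S\n"));
    assert!(manual_qa_flag("# no front matter here\n"));
    println!("ok");
}
```

The same fail-closed shape shows up in the pipeline struct: `manual_qa: Option<bool>` stays `None` until a story opts out, and every consumer interprets `None` as "true".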
@@ -367,4 +396,45 @@ workflow: tdd
         assert!(contents.contains("review_hold: true"));
         assert!(contents.contains("name: My Spike"));
     }
 
+    #[test]
+    fn parses_manual_qa_from_front_matter() {
+        let input = "---\nname: Story\nmanual_qa: false\n---\n# Story\n";
+        let meta = parse_front_matter(input).expect("front matter");
+        assert_eq!(meta.manual_qa, Some(false));
+    }
+
+    #[test]
+    fn manual_qa_defaults_to_none() {
+        let input = "---\nname: Story\n---\n# Story\n";
+        let meta = parse_front_matter(input).expect("front matter");
+        assert_eq!(meta.manual_qa, None);
+    }
+
+    #[test]
+    fn requires_manual_qa_defaults_true() {
+        let tmp = tempfile::tempdir().unwrap();
+        let path = tmp.path().join("story.md");
+        std::fs::write(&path, "---\nname: Test\n---\n# Story\n").unwrap();
+        assert!(requires_manual_qa(&path));
+    }
+
+    #[test]
+    fn requires_manual_qa_false_when_set() {
+        let tmp = tempfile::tempdir().unwrap();
+        let path = tmp.path().join("story.md");
+        std::fs::write(&path, "---\nname: Test\nmanual_qa: false\n---\n# Story\n").unwrap();
+        assert!(!requires_manual_qa(&path));
+    }
+
+    #[test]
+    fn write_rejection_notes_appends_section() {
+        let tmp = tempfile::tempdir().unwrap();
+        let path = tmp.path().join("story.md");
+        std::fs::write(&path, "---\nname: Test\n---\n# Story\n").unwrap();
+        write_rejection_notes(&path, "Button color is wrong").unwrap();
+        let contents = std::fs::read_to_string(&path).unwrap();
+        assert!(contents.contains("## QA Rejection Notes"));
+        assert!(contents.contains("Button color is wrong"));
+    }
 }
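The append-only behavior exercised by `write_rejection_notes_appends_section` can be sketched without the `tempfile` crate; this version writes under `std::env::temp_dir()` and simplifies the error strings, but the section format matches the patch:

```rust
use std::fs;
use std::path::Path;

/// Append a rejection-notes section to a story file so the next coder run
/// sees what human QA flagged. Mirrors the patch's helper with simplified
/// error messages.
fn write_rejection_notes(path: &Path, notes: &str) -> Result<(), String> {
    let contents = fs::read_to_string(path).map_err(|e| e.to_string())?;
    let updated = format!("{contents}\n\n## QA Rejection Notes\n\n{notes}\n");
    fs::write(path, updated).map_err(|e| e.to_string())
}

fn main() {
    let path = std::env::temp_dir().join("story_rejection_demo.md");
    fs::write(&path, "---\nname: Test\n---\n# Story\n").unwrap();
    write_rejection_notes(&path, "Button color is wrong").unwrap();
    let contents = fs::read_to_string(&path).unwrap();
    assert!(contents.contains("## QA Rejection Notes"));
    assert!(contents.contains("Button color is wrong"));
    let _ = fs::remove_file(&path);
    println!("ok");
}
```

Appending rather than rewriting means repeated rejections stack their notes chronologically in the story body, which is the intended audit trail for the coder agent.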
@@ -188,6 +188,7 @@ async fn main() -> Result<(), std::io::Error> {
         reconciliation_tx,
         perm_tx,
         perm_rx,
+        qa_app_process: Arc::new(std::sync::Mutex::new(None)),
     };
 
     let app = build_routes(ctx);
@@ -103,11 +103,7 @@ pub fn load_history(project_root: &std::path::Path) -> HashMap<OwnedRoomId, Room
     persisted
         .rooms
         .into_iter()
-        .filter_map(|(k, v)| {
-            k.parse::<OwnedRoomId>()
-                .ok()
-                .map(|room_id| (room_id, v))
-        })
+        .filter_map(|(k, v)| k.parse::<OwnedRoomId>().ok().map(|room_id| (room_id, v)))
         .collect()
 }
 
@@ -237,10 +233,7 @@ fn read_stage_items(
     project_root: &std::path::Path,
     stage_dir: &str,
 ) -> Vec<(String, Option<String>)> {
-    let dir = project_root
-        .join(".story_kit")
-        .join("work")
-        .join(stage_dir);
+    let dir = project_root.join(".story_kit").join("work").join(stage_dir);
     if !dir.exists() {
         return Vec::new();
     }
@@ -252,9 +245,7 @@ fn read_stage_items(
             continue;
         }
         if let Some(stem) = path.file_stem().and_then(|s| s.to_str()) {
-            let name = std::fs::read_to_string(&path)
-                .ok()
-                .and_then(|contents| {
+            let name = std::fs::read_to_string(&path).ok().and_then(|contents| {
                 crate::io::story_metadata::parse_front_matter(&contents)
                     .ok()
                     .and_then(|m| m.name)
@@ -282,7 +273,7 @@ pub fn build_pipeline_status(project_root: &std::path::Path, agents: &AgentPool)
     let mut out = String::from("**Pipeline Status**\n\n");
 
     let stages = [
-        ("1_upcoming", "Upcoming"),
+        ("1_backlog", "Backlog"),
         ("2_current", "In Progress"),
         ("3_qa", "QA"),
         ("4_merge", "Merge"),
@@ -408,7 +399,10 @@ pub async fn run_bot(
         .ok_or_else(|| "No user ID after login".to_string())?
         .to_owned();
 
-    slog!("[matrix-bot] Logged in as {bot_user_id} (device: {})", login_response.device_id);
+    slog!(
+        "[matrix-bot] Logged in as {bot_user_id} (device: {})",
+        login_response.device_id
+    );
 
     // Bootstrap cross-signing keys for E2EE verification support.
     // Pass the bot's password for UIA (User-Interactive Authentication) —
@@ -540,7 +534,9 @@ pub async fn run_bot(
         agents,
     };
 
-    slog!("[matrix-bot] Cryptographic identity verification is always ON — commands from unencrypted rooms or unverified devices are rejected");
+    slog!(
+        "[matrix-bot] Cryptographic identity verification is always ON — commands from unencrypted rooms or unverified devices are rejected"
+    );
 
     // Register event handlers and inject shared context.
     client.add_event_handler_context(ctx);
@@ -679,9 +675,7 @@ pub fn parse_ambient_command(
     // Strip a leading @mention (handles "@localpart" and "@localpart:server").
     let rest = if let Some(after_at) = lower.strip_prefix('@') {
         // Skip everything up to the first whitespace (the full mention token).
-        let word_end = after_at
-            .find(char::is_whitespace)
-            .unwrap_or(after_at.len());
+        let word_end = after_at.find(char::is_whitespace).unwrap_or(after_at.len());
         after_at[word_end..].trim()
     } else if let Some(after) = lower.strip_prefix(display_lower.as_str()) {
         after.trim()
@@ -734,10 +728,7 @@ async fn is_reply_to_bot(
 /// is the correct trust model: a user is accepted when they have cross-signing
 /// configured, regardless of whether the bot has run an explicit verification
 /// ceremony with a specific device.
-async fn check_sender_verified(
-    client: &Client,
-    sender: &OwnedUserId,
-) -> Result<bool, String> {
+async fn check_sender_verified(client: &Client, sender: &OwnedUserId) -> Result<bool, String> {
     let identity = client
         .encryption()
         .get_user_identity(sender)
@@ -803,8 +794,9 @@ async fn on_to_device_verification_request(
                 }
                 break;
             }
-            VerificationRequestState::Done
-            | VerificationRequestState::Cancelled(_) => break,
+            VerificationRequestState::Done | VerificationRequestState::Cancelled(_) => {
+                break;
+            }
             _ => {}
         }
     }
@@ -1022,11 +1014,9 @@ async fn on_room_message(
     slog!("[matrix-bot] Message from {sender}: {user_message}");
 
     // Check for bot-level commands (e.g. "help") before invoking the LLM.
-    if let Some(response) = super::commands::try_handle_command(
-        &ctx.bot_name,
-        ctx.bot_user_id.as_str(),
-        &user_message,
-    ) {
+    if let Some(response) =
+        super::commands::try_handle_command(&ctx.bot_name, ctx.bot_user_id.as_str(), &user_message)
+    {
         slog!("[matrix-bot] Handled bot command from {sender}");
         let html = markdown_to_html(&response);
         if let Ok(resp) = room
@@ -1083,9 +1073,7 @@ async fn handle_message(
     // flattening history into a text prefix.
     let resume_session_id: Option<String> = {
         let guard = ctx.history.lock().await;
-        guard
-            .get(&room_id)
-            .and_then(|conv| conv.session_id.clone())
+        guard.get(&room_id).and_then(|conv| conv.session_id.clone())
     };
 
     // The prompt is just the current message with sender attribution.
@@ -1260,7 +1248,11 @@ async fn handle_message(
     let conv = guard.entry(room_id).or_default();
 
     // Store the session ID so the next turn uses --resume.
-    slog!("[matrix-bot] storing session_id: {:?} (was: {:?})", new_session_id, conv.session_id);
+    slog!(
+        "[matrix-bot] storing session_id: {:?} (was: {:?})",
+        new_session_id,
+        conv.session_id
+    );
     if new_session_id.is_some() {
         conv.session_id = new_session_id;
     }
@@ -2017,7 +2009,10 @@ mod tests {
|
|||||||
#[test]
|
#[test]
|
||||||
fn startup_announcement_uses_configured_display_name_not_hardcoded() {
|
fn startup_announcement_uses_configured_display_name_not_hardcoded() {
|
||||||
assert_eq!(format_startup_announcement("HAL"), "HAL is online.");
|
assert_eq!(format_startup_announcement("HAL"), "HAL is online.");
|
||||||
assert_eq!(format_startup_announcement("Assistant"), "Assistant is online.");
|
assert_eq!(
|
||||||
|
format_startup_announcement("Assistant"),
|
||||||
|
"Assistant is online."
|
||||||
|
);
|
||||||
}
|
}
|
||||||
|
|
||||||
// -- extract_command (status trigger) ------------------------------------
|
// -- extract_command (status trigger) ------------------------------------
|
||||||
@@ -2072,7 +2067,7 @@ mod tests {
|
|||||||
let pool = AgentPool::new_test(3001);
|
let pool = AgentPool::new_test(3001);
|
||||||
let out = build_pipeline_status(dir.path(), &pool);
|
let out = build_pipeline_status(dir.path(), &pool);
|
||||||
|
|
||||||
assert!(out.contains("Upcoming"), "missing Upcoming: {out}");
|
assert!(out.contains("Backlog"), "missing Backlog: {out}");
|
||||||
assert!(out.contains("In Progress"), "missing In Progress: {out}");
|
assert!(out.contains("In Progress"), "missing In Progress: {out}");
|
||||||
assert!(out.contains("QA"), "missing QA: {out}");
|
assert!(out.contains("QA"), "missing QA: {out}");
|
||||||
assert!(out.contains("Merge"), "missing Merge: {out}");
|
assert!(out.contains("Merge"), "missing Merge: {out}");
|
||||||
@@ -2084,7 +2079,7 @@ mod tests {
|
|||||||
let dir = tempfile::tempdir().unwrap();
|
let dir = tempfile::tempdir().unwrap();
|
||||||
write_story_file(
|
write_story_file(
|
||||||
dir.path(),
|
dir.path(),
|
||||||
"1_upcoming",
|
"1_backlog",
|
||||||
"42_story_do_something.md",
|
"42_story_do_something.md",
|
||||||
"Do Something",
|
"Do Something",
|
||||||
);
|
);
|
||||||
@@ -2104,7 +2099,10 @@ mod tests {
|
|||||||
let pool = AgentPool::new_test(3001);
|
let pool = AgentPool::new_test(3001);
|
||||||
let out = build_pipeline_status(dir.path(), &pool);
|
let out = build_pipeline_status(dir.path(), &pool);
|
||||||
|
|
||||||
assert!(out.contains("Free Agents"), "missing Free Agents section: {out}");
|
assert!(
|
||||||
|
out.contains("Free Agents"),
|
||||||
|
"missing Free Agents section: {out}"
|
||||||
|
);
|
||||||
}
|
}
|
||||||
|
|
||||||
#[test]
|
#[test]
|
||||||
@@ -2114,8 +2112,11 @@ mod tests {
         let out = build_pipeline_status(dir.path(), &pool);

         // Stages and headers should use markdown bold (**text**).
-        assert!(out.contains("**Pipeline Status**"), "missing bold title: {out}");
-        assert!(out.contains("**Upcoming**"), "stage should use bold: {out}");
+        assert!(
+            out.contains("**Pipeline Status**"),
+            "missing bold title: {out}"
+        );
+        assert!(out.contains("**Backlog**"), "stage should use bold: {out}");
     }

     #[test]
@@ -2157,13 +2158,19 @@ mod tests {
     #[test]
     fn ambient_command_on_with_at_mention() {
         let uid = make_user_id("@timmy:homeserver.local");
-        assert_eq!(parse_ambient_command("@timmy ambient on", &uid, "Timmy"), Some(true));
+        assert_eq!(
+            parse_ambient_command("@timmy ambient on", &uid, "Timmy"),
+            Some(true)
+        );
     }

     #[test]
     fn ambient_command_off_with_at_mention() {
         let uid = make_user_id("@timmy:homeserver.local");
-        assert_eq!(parse_ambient_command("@timmy ambient off", &uid, "Timmy"), Some(false));
+        assert_eq!(
+            parse_ambient_command("@timmy ambient off", &uid, "Timmy"),
+            Some(false)
+        );
     }

     #[test]
@@ -2178,39 +2185,60 @@ mod tests {
     #[test]
     fn ambient_command_on_with_display_name() {
         let uid = make_user_id("@timmy:homeserver.local");
-        assert_eq!(parse_ambient_command("timmy ambient on", &uid, "Timmy"), Some(true));
+        assert_eq!(
+            parse_ambient_command("timmy ambient on", &uid, "Timmy"),
+            Some(true)
+        );
     }

     #[test]
     fn ambient_command_off_with_display_name() {
         let uid = make_user_id("@timmy:homeserver.local");
-        assert_eq!(parse_ambient_command("timmy ambient off", &uid, "Timmy"), Some(false));
+        assert_eq!(
+            parse_ambient_command("timmy ambient off", &uid, "Timmy"),
+            Some(false)
+        );
     }

     #[test]
     fn ambient_command_on_bare() {
         // "ambient on" without any bot mention is also recognised.
         let uid = make_user_id("@timmy:homeserver.local");
-        assert_eq!(parse_ambient_command("ambient on", &uid, "Timmy"), Some(true));
+        assert_eq!(
+            parse_ambient_command("ambient on", &uid, "Timmy"),
+            Some(true)
+        );
     }

     #[test]
     fn ambient_command_off_bare() {
         let uid = make_user_id("@timmy:homeserver.local");
-        assert_eq!(parse_ambient_command("ambient off", &uid, "Timmy"), Some(false));
+        assert_eq!(
+            parse_ambient_command("ambient off", &uid, "Timmy"),
+            Some(false)
+        );
     }

     #[test]
     fn ambient_command_case_insensitive() {
         let uid = make_user_id("@timmy:homeserver.local");
-        assert_eq!(parse_ambient_command("@Timmy AMBIENT ON", &uid, "Timmy"), Some(true));
-        assert_eq!(parse_ambient_command("TIMMY AMBIENT OFF", &uid, "Timmy"), Some(false));
+        assert_eq!(
+            parse_ambient_command("@Timmy AMBIENT ON", &uid, "Timmy"),
+            Some(true)
+        );
+        assert_eq!(
+            parse_ambient_command("TIMMY AMBIENT OFF", &uid, "Timmy"),
+            Some(false)
+        );
     }

     #[test]
     fn ambient_command_unrelated_message_returns_none() {
         let uid = make_user_id("@timmy:homeserver.local");
-        assert_eq!(parse_ambient_command("@timmy what is the status?", &uid, "Timmy"), None);
+        assert_eq!(
+            parse_ambient_command("@timmy what is the status?", &uid, "Timmy"),
+            None
+        );
         assert_eq!(parse_ambient_command("hello there", &uid, "Timmy"), None);
         assert_eq!(parse_ambient_command("ambient", &uid, "Timmy"), None);
     }
@@ -2237,11 +2265,17 @@ mod tests {

         let guard = ambient_rooms.lock().await;
         assert!(guard.contains(&room_a), "room_a should be in ambient mode");
-        assert!(!guard.contains(&room_b), "room_b should NOT be in ambient mode");
+        assert!(
+            !guard.contains(&room_b),
+            "room_b should NOT be in ambient mode"
+        );
         drop(guard);

         // Disable ambient mode for room_a.
         ambient_rooms.lock().await.remove(&room_a);
-        assert!(!ambient_rooms.lock().await.contains(&room_a), "room_a ambient mode should be off");
+        assert!(
+            !ambient_rooms.lock().await.contains(&room_a),
+            "room_a ambient mode should be off"
+        );
     }
 }