11 Commits

Author  SHA1        Message                                                                    Date
Timmy   fcc2b9c3eb  Bump version to 0.5.0                                                      2026-03-23 13:11:57 +00:00
dave    0c4239501a  storkit: done 370_bug_scaffold_does_not_create_mcp_json_in_project_root    2026-03-23 13:00:46 +00:00
dave    13b6ecd958  storkit: merge 370_bug_scaffold_does_not_create_mcp_json_in_project_root   2026-03-23 13:00:43 +00:00
dave    1816a94617  storkit: merge 369_bug_cli_treats_help_and_version_as_project_paths        2026-03-23 12:55:58 +00:00
dave    56d3373e69  Revert gVisor (runsc) from Docker setup                                    2026-03-23 12:53:10 +00:00

        gVisor is incompatible with OrbStack bind mounts on macOS — writes to
        /mnt/mac are blocked by the gVisor filesystem sandbox. Removing
        runtime: runsc from docker-compose.yml, the gVisor setup docs from
        README.md, and the runsc assertion test from rebuild.rs.

        The existing Docker hardening (read-only root, cap_drop ALL,
        no-new-privileges, resource limits) remains in place.

        Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

dave    efdb0c5814  storkit: create 370_bug_scaffold_does_not_create_mcp_json_in_project_root  2026-03-23 12:43:48 +00:00
dave    b8365275d8  storkit: create 370_bug_scaffold_does_not_create_mcp_json_in_project_root  2026-03-23 12:43:15 +00:00
dave    6ddfd29927  storkit: create 369_bug_cli_treats_help_and_version_as_project_paths       2026-03-23 12:43:04 +00:00
dave    01b157a2e4  storkit: create 369_bug_cli_treats_help_and_version_as_project_paths       2026-03-23 12:42:04 +00:00
dave    99a59d7ad1  storkit: create 369_bug_cli_treats_help_and_version_as_project_paths       2026-03-23 12:41:27 +00:00
dave    eb8adb6225  storkit: create 369_bug_cli_treats_help_and_version_as_project_paths       2026-03-23 12:39:15 +00:00
14 changed files with 406 additions and 152 deletions

View File

@@ -0,0 +1,34 @@
---
name: "CLI treats --help and --version as project paths"
---
# Bug 369: CLI treats --help and --version as project paths
## Description
When running `storkit <anything>`, the binary treats the first argument as a project path, creates a directory for it, and scaffolds `.storkit/` inside. This happens for `--help`, `--version`, `serve`, `x`, or any other string. There is no validation that the argument is an existing directory or a reasonable path before creating it.
## How to Reproduce
1. Run `storkit --help` or `storkit serve` or `storkit x` in any directory
2. Observe that a directory with that name is created with a full `.storkit/` scaffold inside it
## Actual Result
Any argument is treated as a project path and a directory is created and scaffolded. No flags are recognised.
## Expected Result
- `storkit --help` prints usage info and exits
- `storkit --version` prints the version and exits
- `storkit <path>` only works if the path already exists as a directory
- If the path does not exist, storkit prints a clear error and exits non-zero
## Acceptance Criteria
- [ ] storkit --help prints usage information and exits with code 0
- [ ] storkit --version prints the version and exits with code 0
- [ ] storkit -h and storkit -V work as short aliases
- [ ] storkit does not create directories for any argument — the path must already exist
- [ ] If the path does not exist, storkit prints a clear error and exits non-zero
- [ ] Arguments starting with - that are not recognised produce a clear error message
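
The expected dispatch above can be sketched as follows. This is an illustrative sketch only — `CliAction` and `parse_cli` are hypothetical names, not storkit's actual API; the point is that flags are matched before anything is treated as a path, and the path branch never creates a directory:

```rust
use std::path::Path;

/// Hypothetical result of parsing storkit's single CLI argument.
#[derive(Debug, PartialEq)]
enum CliAction {
    Help,
    Version,
    OpenProject(String),
    Error(String),
}

fn parse_cli(arg: &str) -> CliAction {
    match arg {
        // Flags are recognised first, so they can never be mistaken for paths.
        "--help" | "-h" => CliAction::Help,
        "--version" | "-V" => CliAction::Version,
        // Any other dash-prefixed argument is an unrecognised flag, not a path.
        other if other.starts_with('-') => {
            CliAction::Error(format!("unrecognised flag: {other}"))
        }
        // A path argument must already exist as a directory — never create it.
        path => {
            if Path::new(path).is_dir() {
                CliAction::OpenProject(path.to_string())
            } else {
                CliAction::Error(format!("not an existing directory: {path}"))
            }
        }
    }
}
```

The caller would map `Error` to a non-zero exit code, satisfying the last two criteria.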

View File

@@ -0,0 +1,33 @@
---
name: "Scaffold does not create .mcp.json in project root"
---
# Bug 370: Scaffold does not create .mcp.json in project root
## Description
Two related problems with project setup:
1. When the user clicks the "project setup" button in the web UI to open a new project, the scaffold does not reliably run — the `.storkit/` directory and associated files may not be created.
2. Even when the scaffold does run, it does not write `.mcp.json` to the project root. Without this file, agents spawned in worktrees cannot find the MCP server, causing `--permission-prompt-tool mcp__storkit__prompt_permission not found` errors and agent failures.
## How to Reproduce
1. Open the storkit web UI and use the project setup button to open a new project directory
2. Check whether the full scaffold was created (`.storkit/`, `CLAUDE.md`, `script/test`, etc.)
3. Check the project root for `.mcp.json`
## Actual Result
The scaffold may not run when using the UI project setup flow. When it does run, `.mcp.json` is not created in the project root. Agents fail because MCP tools are unavailable.
## Expected Result
Clicking the project setup button reliably runs the full scaffold, including `.mcp.json` pointing to the server's port.
## Acceptance Criteria
- [ ] The web UI project setup button triggers the full scaffold for new projects
- [ ] scaffold_story_kit writes .mcp.json to the project root with the server's port
- [ ] Existing .mcp.json is not overwritten if already present
- [ ] .mcp.json is included in .gitignore since the port is environment-specific
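
For reference, the `scaffold_story_kit` change in this commit set writes `.mcp.json` with the following shape (port 3001 shown as an example — the actual value is whatever port the running server reports):

```json
{
  "mcpServers": {
    "storkit": {
      "type": "http",
      "url": "http://localhost:3001/mcp"
    }
  }
}
```

Because the port is environment-specific, the file is written only when missing and is added to the root `.gitignore`.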

Cargo.lock generated
View File

@@ -1774,9 +1774,9 @@ checksum = "d98f6fed1fde3f8c21bc40a1abb88dd75e67924f9cffc3ef95607bad8017f8e2"
 [[package]]
 name = "iri-string"
-version = "0.7.10"
+version = "0.7.11"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "c91338f0783edbd6195decb37bae672fd3b165faffb89bf7b9e6942f8b1a731a"
+checksum = "d8e7418f59cc01c88316161279a7f665217ae316b388e58a0d10e29f54f1e5eb"
 dependencies = [
  "memchr",
  "serde",
@@ -1815,7 +1815,7 @@ dependencies = [
  "cesu8",
  "cfg-if",
  "combine",
- "jni-sys",
+ "jni-sys 0.3.1",
  "log",
  "thiserror 1.0.69",
  "walkdir",
@@ -1824,9 +1824,31 @@ dependencies = [
 [[package]]
 name = "jni-sys"
-version = "0.3.0"
+version = "0.3.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "8eaf4bc02d17cbdd7ff4c7438cafcdf7fb9a4613313ad11b4f8fefe7d3fa0130"
+checksum = "41a652e1f9b6e0275df1f15b32661cf0d4b78d4d87ddec5e0c3c20f097433258"
+dependencies = [
+ "jni-sys 0.4.1",
+]
+
+[[package]]
+name = "jni-sys"
+version = "0.4.1"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "c6377a88cb3910bee9b0fa88d4f42e1d2da8e79915598f65fb0c7ee14c878af2"
+dependencies = [
+ "jni-sys-macros",
+]
+
+[[package]]
+name = "jni-sys-macros"
+version = "0.4.1"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "38c0b942f458fe50cdac086d2f946512305e5631e720728f2a61aabcd47a6264"
+dependencies = [
+ "quote",
+ "syn 2.0.117",
+]

 [[package]]
 name = "jobserver"
@@ -2948,9 +2970,9 @@ dependencies = [
 [[package]]
 name = "pulldown-cmark"
-version = "0.13.1"
+version = "0.13.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "83c41efbf8f90ac44de7f3a868f0867851d261b56291732d0cbf7cceaaeb55a6"
+checksum = "7c3a14896dfa883796f1cb410461aef38810ea05f2b2c33c5aded3649095fdad"
 dependencies = [
  "bitflags 2.11.0",
  "memchr",
@@ -3625,9 +3647,9 @@ checksum = "f87165f0995f63a9fbeea62b64d10b4d9d8e78ec6d7d51fb2125fda7bb36788f"
 [[package]]
 name = "rustls-webpki"
-version = "0.103.9"
+version = "0.103.10"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d7df23109aa6c1567d1c575b9952556388da57401e4ace1d15f79eedad0d8f53"
+checksum = "df33b2b81ac578cabaf06b89b0631153a3f416b0a886e8a7a1707fb51abbd1ef"
 dependencies = [
  "aws-lc-rs",
  "ring",
@@ -3994,7 +4016,7 @@ checksum = "6ce2be8dc25455e1f91df71bfa12ad37d7af1092ae736f3a6cd0e37bc7810596"
 [[package]]
 name = "storkit"
-version = "0.4.1"
+version = "0.5.0"
 dependencies = [
  "async-stream",
  "async-trait",

View File

@@ -77,64 +77,6 @@ ldd target/x86_64-unknown-linux-musl/release/storkit
 ./storkit
 ```
-
-## Running in Docker (with gVisor sandboxing)
-
-The `docker/docker-compose.yml` runs the container under [gVisor](https://gvisor.dev/)
-(`runtime: runsc`). gVisor intercepts all container syscalls in userspace, providing an
-extra layer of isolation so that even a compromised workload cannot make raw syscalls to
-the host kernel.
-
-### Host setup (Linux only)
-
-gVisor is a Linux technology. On macOS (OrbStack, Docker Desktop) you must remove
-`runtime: runsc` from `docker/docker-compose.yml` — gVisor is not available there.
-
-**1. Install gVisor (Debian/Ubuntu):**
-
-```bash
-curl -fsSL https://gvisor.dev/archive.key | sudo gpg --dearmor -o /usr/share/keyrings/gvisor-archive-keyring.gpg
-echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/gvisor-archive-keyring.gpg] \
-  https://storage.googleapis.com/gvisor/releases release main" \
-  | sudo tee /etc/apt/sources.list.d/gvisor.list
-sudo apt-get update && sudo apt-get install -y runsc
-```
-
-**2. Register runsc with Docker (`/etc/docker/daemon.json`):**
-
-```json
-{
-  "runtimes": {
-    "runsc": { "path": "/usr/bin/runsc" }
-  }
-}
-```
-
-**3. Restart Docker and verify:**
-
-```bash
-sudo systemctl restart docker
-docker run --runtime=runsc hello-world
-```
-
-**4. Launch storkit:**
-
-```bash
-GIT_USER_NAME="Your Name" GIT_USER_EMAIL="you@example.com" \
-PROJECT_PATH=/path/to/your/repo \
-docker compose -f docker/docker-compose.yml up
-```
-
-### gVisor compatibility notes
-
-The following storkit subsystems have been verified to work under `runsc`:
-
-- **PTY-based agent spawning** (`portable_pty` / `openpty`) — gVisor implements the
-  full POSIX PTY interface (`/dev/ptmx`, `TIOCGWINSZ`, etc.).
-- **`rebuild_and_restart`** — uses `execve()` to replace the server process, which
-  gVisor fully supports.
-- **Rust compilation** — `cargo build` inside the container invokes standard fork/exec
-  primitives, all of which gVisor implements.
-
 ## Releasing
 Builds both macOS and Linux binaries locally, tags the repo, and publishes a Gitea release with a changelog.

View File

@@ -8,39 +8,12 @@
 # OrbStack users: just install OrbStack and use `docker compose` normally.
 # OrbStack's VirtioFS bind mount driver is significantly faster than
 # Docker Desktop's default (see spike findings).
-#
-# ── gVisor (runsc) host setup ────────────────────────────────────────────
-# This compose file uses `runtime: runsc` (gVisor) for syscall-level
-# sandboxing. gVisor intercepts all container syscalls in userspace so
-# that even if a malicious workload escapes the container's process
-# namespace it cannot make raw syscalls to the host kernel.
-#
-# Prerequisites on the Docker host:
-#   1. Install gVisor:
-#      curl -fsSL https://gvisor.dev/archive.key | sudo gpg --dearmor -o /usr/share/keyrings/gvisor-archive-keyring.gpg
-#      echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/gvisor-archive-keyring.gpg] https://storage.googleapis.com/gvisor/releases release main" | sudo tee /etc/apt/sources.list.d/gvisor.list
-#      sudo apt-get update && sudo apt-get install -y runsc
-#   2. Register runsc with Docker (/etc/docker/daemon.json):
-#      {
-#        "runtimes": {
-#          "runsc": { "path": "/usr/bin/runsc" }
-#        }
-#      }
-#   3. Restart Docker: sudo systemctl restart docker
-#   4. Verify: docker run --runtime=runsc hello-world
-#
-# Note: On macOS (OrbStack / Docker Desktop) gVisor is Linux-only and
-# not supported. Remove `runtime: runsc` for local development on macOS.

 services:
   storkit:
     build:
       context: ..
       dockerfile: docker/Dockerfile
-    # Run under gVisor for syscall-level sandboxing.
-    # Requires runsc installed and registered in /etc/docker/daemon.json.
-    # See host setup instructions in the header comment above.
-    runtime: runsc
     container_name: storkit
     ports:
       # Bind to localhost only — not exposed on all interfaces.

View File

@@ -1,12 +1,12 @@
 {
   "name": "living-spec-standalone",
-  "version": "0.4.1",
+  "version": "0.5.0",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "living-spec-standalone",
-      "version": "0.4.1",
+      "version": "0.5.0",
       "dependencies": {
         "@types/react-syntax-highlighter": "^15.5.13",
         "react": "^19.1.0",

View File

@@ -1,7 +1,7 @@
 {
   "name": "living-spec-standalone",
   "private": true,
-  "version": "0.4.1",
+  "version": "0.5.0",
   "type": "module",
   "scripts": {
     "dev": "vite",

View File

@@ -147,9 +147,65 @@ else
     | sed 's/^/- /')
 fi

+# ── Generate summary overview ─────────────────────────────────
+# Group completed items by keyword clusters to identify the
+# release's focus areas.
+generate_summary() {
+  local all_items="$1"
+  local themes=""
+  # Count items matching each theme keyword (one item per line via echo -e)
+  local expanded
+  expanded=$(echo -e "$all_items")
+  local bot_count=$(echo "$expanded" | grep -icE 'bot|command|chat|matrix|slack|whatsapp|status|help|assign|rebuild|shutdown|whatsup' || true)
+  local mcp_count=$(echo "$expanded" | grep -icE 'mcp|tool' || true)
+  local docker_count=$(echo "$expanded" | grep -icE 'docker|container|gvisor|orbstack|harden|security' || true)
+  local agent_count=$(echo "$expanded" | grep -icE 'agent|runtime|chatgpt|gemini|openai|model|coder' || true)
+  local ui_count=$(echo "$expanded" | grep -icE 'frontend|ui|web|oauth|scaffold' || true)
+  local infra_count=$(echo "$expanded" | grep -icE 'release|makefile|refactor|upgrade|worktree|pipeline' || true)
+  # Build theme list, highest count first
+  local -a theme_pairs=()
+  [ "$agent_count" -gt 0 ] && theme_pairs+=("${agent_count}:multi-model agents")
+  [ "$bot_count" -gt 0 ] && theme_pairs+=("${bot_count}:bot commands")
+  [ "$mcp_count" -gt 0 ] && theme_pairs+=("${mcp_count}:MCP tools")
+  [ "$docker_count" -gt 0 ] && theme_pairs+=("${docker_count}:Docker hardening")
+  [ "$ui_count" -gt 0 ] && theme_pairs+=("${ui_count}:developer experience")
+  [ "$infra_count" -gt 0 ] && theme_pairs+=("${infra_count}:infrastructure")
+  # Sort by count descending, take top 3
+  local sorted=$(printf '%s\n' "${theme_pairs[@]}" | sort -t: -k1 -nr | head -3)
+  local labels=""
+  while IFS=: read -r count label; do
+    [ -z "$label" ] && continue
+    if [ -z "$labels" ]; then
+      # Capitalise first theme
+      labels="$(echo "${label:0:1}" | tr '[:lower:]' '[:upper:]')${label:1}"
+    else
+      labels="${labels}, ${label}"
+    fi
+  done <<< "$sorted"
+  echo "$labels"
+}
+
+ALL_ITEMS="${FEATURES}${FIXES}${REFACTORS}"
+SUMMARY=$(generate_summary "$ALL_ITEMS")
+if [ -n "$SUMMARY" ]; then
+  SUMMARY_LINE="**Focus:** ${SUMMARY}"
+else
+  SUMMARY_LINE=""
+fi
+
 # Assemble the release body.
 RELEASE_BODY="## What's Changed"

+if [ -n "$SUMMARY_LINE" ]; then
+  RELEASE_BODY="${RELEASE_BODY}
+${SUMMARY_LINE}"
+fi
+
 if [ -n "$FEATURES" ]; then
   RELEASE_BODY="${RELEASE_BODY}

View File

@@ -1,6 +1,6 @@
 [package]
 name = "storkit"
-version = "0.4.1"
+version = "0.5.0"
 edition = "2024"
 build = "build.rs"

View File

@@ -144,6 +144,10 @@ impl AgentPool {
         }
     }

+    pub fn port(&self) -> u16 {
+        self.port
+    }
+
     /// Create a pool with a dummy watcher channel for unit tests.
     #[cfg(test)]
     pub fn new_test(port: u16) -> Self {

View File

@@ -39,6 +39,7 @@ impl ProjectApi {
             payload.0.path,
             &self.ctx.state,
             self.ctx.store.as_ref(),
+            self.ctx.agents.port(),
         )
         .await
         .map_err(bad_request)?;

View File

@@ -369,7 +369,7 @@ fn write_story_kit_gitignore(root: &Path) -> Result<(), String> {
/// the project root and git does not support `../` patterns in `.gitignore` /// the project root and git does not support `../` patterns in `.gitignore`
/// files, so they cannot be expressed in `.storkit/.gitignore`. /// files, so they cannot be expressed in `.storkit/.gitignore`.
fn append_root_gitignore_entries(root: &Path) -> Result<(), String> { fn append_root_gitignore_entries(root: &Path) -> Result<(), String> {
let entries = [".storkit_port", "store.json"]; let entries = [".storkit_port", "store.json", ".mcp.json"];
let gitignore_path = root.join(".gitignore"); let gitignore_path = root.join(".gitignore");
let existing = if gitignore_path.exists() { let existing = if gitignore_path.exists() {
@@ -404,7 +404,7 @@ fn append_root_gitignore_entries(root: &Path) -> Result<(), String> {
Ok(()) Ok(())
} }
fn scaffold_story_kit(root: &Path) -> Result<(), String> { fn scaffold_story_kit(root: &Path, port: u16) -> Result<(), String> {
let story_kit_root = root.join(".storkit"); let story_kit_root = root.join(".storkit");
let specs_root = story_kit_root.join("specs"); let specs_root = story_kit_root.join("specs");
let tech_root = specs_root.join("tech"); let tech_root = specs_root.join("tech");
@@ -440,6 +440,14 @@ fn scaffold_story_kit(root: &Path) -> Result<(), String> {
write_script_if_missing(&script_root.join("test"), STORY_KIT_SCRIPT_TEST)?; write_script_if_missing(&script_root.join("test"), STORY_KIT_SCRIPT_TEST)?;
write_file_if_missing(&root.join("CLAUDE.md"), STORY_KIT_CLAUDE_MD)?; write_file_if_missing(&root.join("CLAUDE.md"), STORY_KIT_CLAUDE_MD)?;
// Write .mcp.json at the project root so agents can find the MCP server.
// Only written when missing — never overwrites an existing file, because
// the port is environment-specific and must not clobber a running instance.
let mcp_content = format!(
"{{\n \"mcpServers\": {{\n \"storkit\": {{\n \"type\": \"http\",\n \"url\": \"http://localhost:{port}/mcp\"\n }}\n }}\n}}\n"
);
write_file_if_missing(&root.join(".mcp.json"), &mcp_content)?;
// Create .claude/settings.json with sensible permission defaults so that // Create .claude/settings.json with sensible permission defaults so that
// Claude Code (both agents and web UI chat) can operate without constant // Claude Code (both agents and web UI chat) can operate without constant
// permission prompts. // permission prompts.
@@ -505,14 +513,14 @@ fn scaffold_story_kit(root: &Path) -> Result<(), String> {
Ok(()) Ok(())
} }
async fn ensure_project_root_with_story_kit(path: PathBuf) -> Result<(), String> { async fn ensure_project_root_with_story_kit(path: PathBuf, port: u16) -> Result<(), String> {
tokio::task::spawn_blocking(move || { tokio::task::spawn_blocking(move || {
if !path.exists() { if !path.exists() {
fs::create_dir_all(&path) fs::create_dir_all(&path)
.map_err(|e| format!("Failed to create project directory: {}", e))?; .map_err(|e| format!("Failed to create project directory: {}", e))?;
} }
if !path.join(".storkit").is_dir() { if !path.join(".storkit").is_dir() {
scaffold_story_kit(&path)?; scaffold_story_kit(&path, port)?;
} }
Ok(()) Ok(())
}) })
@@ -524,10 +532,11 @@ pub async fn open_project(
path: String, path: String,
state: &SessionState, state: &SessionState,
store: &dyn StoreOps, store: &dyn StoreOps,
port: u16,
) -> Result<String, String> { ) -> Result<String, String> {
let p = PathBuf::from(&path); let p = PathBuf::from(&path);
ensure_project_root_with_story_kit(p.clone()).await?; ensure_project_root_with_story_kit(p.clone(), port).await?;
validate_project_path(p.clone()).await?; validate_project_path(p.clone()).await?;
{ {
@@ -816,7 +825,7 @@ mod tests {
let store = make_store(&dir); let store = make_store(&dir);
let state = SessionState::default(); let state = SessionState::default();
let result = open_project(project_dir.to_string_lossy().to_string(), &state, &store).await; let result = open_project(project_dir.to_string_lossy().to_string(), &state, &store, 3001).await;
assert!(result.is_ok()); assert!(result.is_ok());
let root = state.get_project_root().unwrap(); let root = state.get_project_root().unwrap();
@@ -824,26 +833,47 @@ mod tests {
} }
#[tokio::test] #[tokio::test]
async fn open_project_does_not_write_mcp_json() { async fn open_project_does_not_overwrite_existing_mcp_json() {
// open_project must NOT overwrite .mcp.json — test servers started by QA // scaffold must NOT overwrite .mcp.json when it already exists — QA
// agents share the real project root, so writing here would clobber the // test servers share the real project root, and re-writing would
// root .mcp.json with the wrong port. .mcp.json is written once during // clobber the file with the wrong port.
// worktree creation (worktree.rs) and should not be touched again. let dir = tempdir().unwrap();
let project_dir = dir.path().join("myproject");
fs::create_dir_all(&project_dir).unwrap();
// Pre-write .mcp.json with a different port to simulate an already-configured project.
let mcp_path = project_dir.join(".mcp.json");
fs::write(&mcp_path, "{\"existing\": true}").unwrap();
let store = make_store(&dir);
let state = SessionState::default();
open_project(project_dir.to_string_lossy().to_string(), &state, &store, 3001)
.await
.unwrap();
assert_eq!(
fs::read_to_string(&mcp_path).unwrap(),
"{\"existing\": true}",
"open_project must not overwrite an existing .mcp.json"
);
}
#[tokio::test]
async fn open_project_writes_mcp_json_when_missing() {
let dir = tempdir().unwrap(); let dir = tempdir().unwrap();
let project_dir = dir.path().join("myproject"); let project_dir = dir.path().join("myproject");
fs::create_dir_all(&project_dir).unwrap(); fs::create_dir_all(&project_dir).unwrap();
let store = make_store(&dir); let store = make_store(&dir);
let state = SessionState::default(); let state = SessionState::default();
open_project(project_dir.to_string_lossy().to_string(), &state, &store) open_project(project_dir.to_string_lossy().to_string(), &state, &store, 3001)
.await .await
.unwrap(); .unwrap();
let mcp_path = project_dir.join(".mcp.json"); let mcp_path = project_dir.join(".mcp.json");
assert!( assert!(mcp_path.exists(), "open_project should write .mcp.json for new projects");
!mcp_path.exists(), let content = fs::read_to_string(&mcp_path).unwrap();
"open_project must not write .mcp.json — that would overwrite the root with the wrong port" assert!(content.contains("3001"), "mcp.json should reference the server port");
); assert!(content.contains("localhost"), "mcp.json should reference localhost");
} }
#[tokio::test] #[tokio::test]
@@ -898,7 +928,7 @@ mod tests {
let store = make_store(&dir); let store = make_store(&dir);
let state = SessionState::default(); let state = SessionState::default();
open_project(project_dir.to_string_lossy().to_string(), &state, &store) open_project(project_dir.to_string_lossy().to_string(), &state, &store, 3001)
.await .await
.unwrap(); .unwrap();
@@ -1071,7 +1101,7 @@ mod tests {
#[test] #[test]
fn scaffold_story_kit_creates_structure() { fn scaffold_story_kit_creates_structure() {
let dir = tempdir().unwrap(); let dir = tempdir().unwrap();
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
assert!(dir.path().join(".storkit/README.md").exists()); assert!(dir.path().join(".storkit/README.md").exists());
assert!(dir.path().join(".storkit/project.toml").exists()); assert!(dir.path().join(".storkit/project.toml").exists());
@@ -1085,7 +1115,7 @@ mod tests {
#[test] #[test]
fn scaffold_story_kit_creates_work_pipeline_dirs() { fn scaffold_story_kit_creates_work_pipeline_dirs() {
let dir = tempdir().unwrap(); let dir = tempdir().unwrap();
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
let stages = [ let stages = [
"1_backlog", "1_backlog",
@@ -1109,7 +1139,7 @@ mod tests {
#[test] #[test]
fn scaffold_story_kit_project_toml_has_coder_qa_mergemaster() { fn scaffold_story_kit_project_toml_has_coder_qa_mergemaster() {
let dir = tempdir().unwrap(); let dir = tempdir().unwrap();
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
let content = fs::read_to_string(dir.path().join(".storkit/project.toml")).unwrap(); let content = fs::read_to_string(dir.path().join(".storkit/project.toml")).unwrap();
assert!(content.contains("[[agent]]")); assert!(content.contains("[[agent]]"));
@@ -1122,7 +1152,7 @@ mod tests {
#[test] #[test]
fn scaffold_context_is_blank_template_not_story_kit_content() { fn scaffold_context_is_blank_template_not_story_kit_content() {
let dir = tempdir().unwrap(); let dir = tempdir().unwrap();
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
let content = fs::read_to_string(dir.path().join(".storkit/specs/00_CONTEXT.md")).unwrap(); let content = fs::read_to_string(dir.path().join(".storkit/specs/00_CONTEXT.md")).unwrap();
assert!(content.contains("<!-- storkit:scaffold-template -->")); assert!(content.contains("<!-- storkit:scaffold-template -->"));
@@ -1138,7 +1168,7 @@ mod tests {
#[test] #[test]
fn scaffold_stack_is_blank_template_not_story_kit_content() { fn scaffold_stack_is_blank_template_not_story_kit_content() {
let dir = tempdir().unwrap(); let dir = tempdir().unwrap();
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
let content = fs::read_to_string(dir.path().join(".storkit/specs/tech/STACK.md")).unwrap(); let content = fs::read_to_string(dir.path().join(".storkit/specs/tech/STACK.md")).unwrap();
assert!(content.contains("<!-- storkit:scaffold-template -->")); assert!(content.contains("<!-- storkit:scaffold-template -->"));
@@ -1157,7 +1187,7 @@ mod tests {
use std::os::unix::fs::PermissionsExt; use std::os::unix::fs::PermissionsExt;
let dir = tempdir().unwrap(); let dir = tempdir().unwrap();
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
let script_test = dir.path().join("script/test"); let script_test = dir.path().join("script/test");
assert!(script_test.exists(), "script/test should be created"); assert!(script_test.exists(), "script/test should be created");
@@ -1175,7 +1205,7 @@ mod tests {
fs::create_dir_all(readme.parent().unwrap()).unwrap(); fs::create_dir_all(readme.parent().unwrap()).unwrap();
fs::write(&readme, "custom content").unwrap(); fs::write(&readme, "custom content").unwrap();
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
assert_eq!(fs::read_to_string(&readme).unwrap(), "custom content"); assert_eq!(fs::read_to_string(&readme).unwrap(), "custom content");
} }
@@ -1183,13 +1213,13 @@ mod tests {
#[test] #[test]
fn scaffold_story_kit_is_idempotent() { fn scaffold_story_kit_is_idempotent() {
let dir = tempdir().unwrap(); let dir = tempdir().unwrap();
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
let readme_content = fs::read_to_string(dir.path().join(".storkit/README.md")).unwrap(); let readme_content = fs::read_to_string(dir.path().join(".storkit/README.md")).unwrap();
let toml_content = fs::read_to_string(dir.path().join(".storkit/project.toml")).unwrap(); let toml_content = fs::read_to_string(dir.path().join(".storkit/project.toml")).unwrap();
// Run again — must not change content or add duplicate .gitignore entries // Run again — must not change content or add duplicate .gitignore entries
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
assert_eq!( assert_eq!(
fs::read_to_string(dir.path().join(".storkit/README.md")).unwrap(), fs::read_to_string(dir.path().join(".storkit/README.md")).unwrap(),
@@ -1237,7 +1267,7 @@ mod tests {
.status() .status()
.unwrap(); .unwrap();
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
// Only 1 commit should exist — scaffold must not commit into an existing repo // Only 1 commit should exist — scaffold must not commit into an existing repo
let log_output = std::process::Command::new("git") let log_output = std::process::Command::new("git")
@@ -1256,7 +1286,7 @@ mod tests {
#[test] #[test]
fn scaffold_creates_story_kit_gitignore_with_relative_entries() { fn scaffold_creates_story_kit_gitignore_with_relative_entries() {
let dir = tempdir().unwrap(); let dir = tempdir().unwrap();
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
// .storkit/.gitignore must contain relative patterns for files under .storkit/ // .storkit/.gitignore must contain relative patterns for files under .storkit/
let sk_content = fs::read_to_string(dir.path().join(".storkit/.gitignore")).unwrap(); let sk_content = fs::read_to_string(dir.path().join(".storkit/.gitignore")).unwrap();
@@ -1287,7 +1317,7 @@ mod tests {
) )
.unwrap(); .unwrap();
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
let content = fs::read_to_string(dir.path().join(".storkit/.gitignore")).unwrap(); let content = fs::read_to_string(dir.path().join(".storkit/.gitignore")).unwrap();
let worktrees_count = content.lines().filter(|l| l.trim() == "worktrees/").count(); let worktrees_count = content.lines().filter(|l| l.trim() == "worktrees/").count();
@@ -1303,7 +1333,7 @@ mod tests {
#[test] #[test]
fn scaffold_creates_claude_md_at_project_root() { fn scaffold_creates_claude_md_at_project_root() {
let dir = tempdir().unwrap(); let dir = tempdir().unwrap();
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
let claude_md = dir.path().join("CLAUDE.md"); let claude_md = dir.path().join("CLAUDE.md");
assert!( assert!(
@@ -1332,7 +1362,7 @@ mod tests {
let claude_md = dir.path().join("CLAUDE.md"); let claude_md = dir.path().join("CLAUDE.md");
fs::write(&claude_md, "custom CLAUDE.md content").unwrap(); fs::write(&claude_md, "custom CLAUDE.md content").unwrap();
scaffold_story_kit(dir.path()).unwrap(); scaffold_story_kit(dir.path(), 3001).unwrap();
assert_eq!( assert_eq!(
fs::read_to_string(&claude_md).unwrap(), fs::read_to_string(&claude_md).unwrap(),
@@ -1341,6 +1371,46 @@ mod tests {
); );
} }
#[test]
fn scaffold_story_kit_writes_mcp_json_with_port() {
let dir = tempdir().unwrap();
scaffold_story_kit(dir.path(), 4242).unwrap();
let mcp_path = dir.path().join(".mcp.json");
assert!(mcp_path.exists(), ".mcp.json should be created by scaffold");
let content = fs::read_to_string(&mcp_path).unwrap();
assert!(content.contains("4242"), ".mcp.json should reference the given port");
assert!(content.contains("localhost"), ".mcp.json should reference localhost");
assert!(content.contains("storkit"), ".mcp.json should name the storkit server");
}
#[test]
fn scaffold_story_kit_does_not_overwrite_existing_mcp_json() {
let dir = tempdir().unwrap();
let mcp_path = dir.path().join(".mcp.json");
fs::write(&mcp_path, "{\"custom\": true}").unwrap();
scaffold_story_kit(dir.path(), 3001).unwrap();
assert_eq!(
fs::read_to_string(&mcp_path).unwrap(),
"{\"custom\": true}",
"scaffold should not overwrite an existing .mcp.json"
);
}
#[test]
fn scaffold_gitignore_includes_mcp_json() {
let dir = tempdir().unwrap();
scaffold_story_kit(dir.path(), 3001).unwrap();
let root_gitignore = fs::read_to_string(dir.path().join(".gitignore")).unwrap();
assert!(
root_gitignore.contains(".mcp.json"),
"root .gitignore should include .mcp.json (port is environment-specific)"
);
}
     // --- open_project scaffolding ---
 
     #[tokio::test]
@@ -1351,7 +1421,7 @@ mod tests {
         let store = make_store(&dir);
         let state = SessionState::default();
-        open_project(project_dir.to_string_lossy().to_string(), &state, &store)
+        open_project(project_dir.to_string_lossy().to_string(), &state, &store, 3001)
             .await
             .unwrap();
@@ -1370,7 +1440,7 @@ mod tests {
         let store = make_store(&dir);
         let state = SessionState::default();
-        open_project(project_dir.to_string_lossy().to_string(), &state, &store)
+        open_project(project_dir.to_string_lossy().to_string(), &state, &store, 3001)
             .await
             .unwrap();
@@ -1572,7 +1642,7 @@ mod tests {
         )
         .unwrap();
-        scaffold_story_kit(dir.path()).unwrap();
+        scaffold_story_kit(dir.path(), 3001).unwrap();
         let content = fs::read_to_string(dir.path().join(".storkit/project.toml")).unwrap();
         assert!(
@@ -1592,7 +1662,7 @@ mod tests {
     #[test]
     fn scaffold_project_toml_fallback_when_no_stack_detected() {
         let dir = tempdir().unwrap();
-        scaffold_story_kit(dir.path()).unwrap();
+        scaffold_story_kit(dir.path(), 3001).unwrap();
         let content = fs::read_to_string(dir.path().join(".storkit/project.toml")).unwrap();
         assert!(
@@ -1614,7 +1684,7 @@ mod tests {
         let existing = "[[component]]\nname = \"custom\"\npath = \".\"\nsetup = [\"make build\"]\n";
         fs::write(sk_dir.join("project.toml"), existing).unwrap();
-        scaffold_story_kit(dir.path()).unwrap();
+        scaffold_story_kit(dir.path(), 3001).unwrap();
         let content = fs::read_to_string(sk_dir.join("project.toml")).unwrap();
         assert_eq!(

View File

@@ -34,6 +34,32 @@ use std::path::PathBuf;
 use std::sync::Arc;
 use tokio::sync::broadcast;
+/// What the first CLI argument means.
+#[derive(Debug, PartialEq)]
+enum CliDirective {
+    /// `--help` / `-h`
+    Help,
+    /// `--version` / `-V`
+    Version,
+    /// An unrecognised flag (starts with `-`).
+    UnknownFlag(String),
+    /// A positional path argument.
+    Path,
+    /// No arguments at all.
+    None,
+}
+
+/// Inspect the raw CLI arguments and return the directive they imply.
+fn classify_cli_args(args: &[String]) -> CliDirective {
+    match args.first().map(String::as_str) {
+        None => CliDirective::None,
+        Some("--help" | "-h") => CliDirective::Help,
+        Some("--version" | "-V") => CliDirective::Version,
+        Some(a) if a.starts_with('-') => CliDirective::UnknownFlag(a.to_string()),
+        Some(_) => CliDirective::Path,
+    }
+}
 /// Resolve the optional positional path argument (everything after the binary
 /// name) into an absolute `PathBuf`. Returns `None` when no argument was
 /// supplied so that the caller can fall back to the auto-detect behaviour.
@@ -53,8 +79,61 @@ async fn main() -> Result<(), std::io::Error> {
     // Collect CLI args, skipping the binary name (argv[0]).
     let cli_args: Vec<String> = std::env::args().skip(1).collect();
+    // Handle CLI flags before treating anything as a project path.
+    match classify_cli_args(&cli_args) {
+        CliDirective::Help => {
+            println!("storkit [PATH]");
+            println!();
+            println!("Serve a storkit project.");
+            println!();
+            println!("USAGE:");
+            println!(" storkit [PATH]");
+            println!();
+            println!("ARGS:");
+            println!(
+                " PATH Path to an existing project directory. \
+                If omitted, storkit searches parent directories for a .storkit/ root."
+            );
+            println!();
+            println!("OPTIONS:");
+            println!(" -h, --help Print this help and exit");
+            println!(" -V, --version Print the version and exit");
+            std::process::exit(0);
+        }
+        CliDirective::Version => {
+            println!("storkit {}", env!("CARGO_PKG_VERSION"));
+            std::process::exit(0);
+        }
+        CliDirective::UnknownFlag(flag) => {
+            eprintln!("error: unknown option: {flag}");
+            eprintln!("Run 'storkit --help' for usage.");
+            std::process::exit(1);
+        }
+        CliDirective::Path | CliDirective::None => {}
+    }
     let explicit_path = parse_project_path_arg(&cli_args, &cwd);
+    // When a path is given explicitly on the CLI, it must already exist as a
+    // directory. We do not create directories from the command line.
+    if let Some(ref path) = explicit_path {
+        if !path.exists() {
+            eprintln!("error: path does not exist: {}", path.display());
+            std::process::exit(1);
+        }
+        if !path.is_dir() {
+            eprintln!("error: path is not a directory: {}", path.display());
+            std::process::exit(1);
+        }
+    }
     if let Some(explicit_root) = explicit_path {
         // An explicit path was given on the command line.
         // Open it directly — scaffold .storkit/ if it is missing — and
@@ -63,6 +142,7 @@ async fn main() -> Result<(), std::io::Error> {
         explicit_root.to_string_lossy().to_string(),
         &app_state,
         store.as_ref(),
+        port,
     )
     .await
     {
@@ -85,6 +165,7 @@ async fn main() -> Result<(), std::io::Error> {
         project_root.to_string_lossy().to_string(),
         &app_state,
         store.as_ref(),
+        port,
     )
     .await
     .unwrap_or_else(|e| {
@@ -399,6 +480,61 @@ name = "coder"
         .unwrap_or_else(|e| panic!("Invalid project.toml: {e}"));
     }
+    // ── classify_cli_args ─────────────────────────────────────────────────
+
+    #[test]
+    fn classify_none_when_no_args() {
+        assert_eq!(classify_cli_args(&[]), CliDirective::None);
+    }
+
+    #[test]
+    fn classify_help_long() {
+        assert_eq!(classify_cli_args(&["--help".to_string()]), CliDirective::Help);
+    }
+
+    #[test]
+    fn classify_help_short() {
+        assert_eq!(classify_cli_args(&["-h".to_string()]), CliDirective::Help);
+    }
+
+    #[test]
+    fn classify_version_long() {
+        assert_eq!(classify_cli_args(&["--version".to_string()]), CliDirective::Version);
+    }
+
+    #[test]
+    fn classify_version_short() {
+        assert_eq!(classify_cli_args(&["-V".to_string()]), CliDirective::Version);
+    }
+
+    #[test]
+    fn classify_unknown_flag() {
+        assert_eq!(
+            classify_cli_args(&["--serve".to_string()]),
+            CliDirective::UnknownFlag("--serve".to_string())
+        );
+    }
+
+    #[test]
+    fn classify_path() {
+        assert_eq!(classify_cli_args(&["/some/path".to_string()]), CliDirective::Path);
+    }
     // ── parse_project_path_arg ────────────────────────────────────────────
 
     #[test]

View File

@@ -189,23 +189,6 @@ mod tests {
     use crate::transport::MessageId;
     use std::sync::Mutex;
-    // ── AC: docker-compose.yml specifies runtime: runsc ──────────────────
-
-    // docker-compose.yml embedded at compile time for a hermetic test.
-    const DOCKER_COMPOSE_YML: &str =
-        include_str!(concat!(env!("CARGO_MANIFEST_DIR"), "/../docker/docker-compose.yml"));
-
-    /// The docker-compose.yml must opt the container into the gVisor runtime
-    /// so that all container syscalls are intercepted in userspace.
-    #[test]
-    fn docker_compose_specifies_runsc_runtime() {
-        assert!(
-            DOCKER_COMPOSE_YML.contains("runtime: runsc"),
-            "docker/docker-compose.yml must contain `runtime: runsc` \
-             to enable gVisor sandboxing"
-        );
-    }
     /// In-memory transport that records sent messages.
     struct CapturingTransport {
         sent: Mutex<Vec<(String, String)>>,