From 9622a1a57268b13e0b757f56c68400452e054f1d Mon Sep 17 00:00:00 2001
From: Dave
Date: Fri, 20 Mar 2026 09:16:10 +0000
Subject: [PATCH] story-kit: create 344_story_chatgpt_agent_backend_via_openai_api

---
 .../344_story_chatgpt_agent_backend_via_openai_api.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/.story_kit/work/1_backlog/344_story_chatgpt_agent_backend_via_openai_api.md b/.story_kit/work/1_backlog/344_story_chatgpt_agent_backend_via_openai_api.md
index d173225..69d9af2 100644
--- a/.story_kit/work/1_backlog/344_story_chatgpt_agent_backend_via_openai_api.md
+++ b/.story_kit/work/1_backlog/344_story_chatgpt_agent_backend_via_openai_api.md
@@ -13,7 +13,8 @@ As a project owner, I want to run agents using ChatGPT (GPT-4o, o3, etc.) via th
 - [ ] Implement OpenAiRuntime using the AgentRuntime trait from refactor 343
 - [ ] Supports GPT-4o and o3 models via the OpenAI chat completions API
 - [ ] Manages a conversation loop: send prompt + tool definitions, execute tool calls, continue until done
-- [ ] Tool definitions map to the same tools agents currently use (file read/write, bash, grep, etc.)
+- [ ] Agents connect to storkit's MCP server for all tool operations — no custom file/bash tools needed
+- [ ] MCP tool definitions are converted to OpenAI function calling format
 - [ ] Configurable in project.toml: runtime = 'openai', model = 'gpt-4o'
 - [ ] OPENAI_API_KEY passed via environment variable
 - [ ] Token usage tracked and logged to token_usage.jsonl
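
The acceptance criterion "MCP tool definitions are converted to OpenAI function calling format" could be sketched roughly as below. This is a minimal illustration, not the story's implementation: the helper name `mcp_tools_to_openai` and the example `read_file` tool are hypothetical, and it assumes MCP tools expose `name`, `description`, and a JSON Schema `inputSchema` (as in an MCP `tools/list` response), which maps onto the `parameters` field of an OpenAI function-calling tool.

```python
def mcp_tools_to_openai(mcp_tools):
    """Map MCP tool definitions (name, description, inputSchema) to the
    OpenAI chat-completions `tools` array for function calling.

    Hypothetical helper for illustration only.
    """
    return [
        {
            "type": "function",
            "function": {
                "name": tool["name"],
                "description": tool.get("description", ""),
                # MCP's inputSchema is already JSON Schema, which is the
                # shape OpenAI expects under "parameters".
                "parameters": tool.get(
                    "inputSchema", {"type": "object", "properties": {}}
                ),
            },
        }
        for tool in mcp_tools
    ]

# Example MCP tool as a server might advertise it (hypothetical):
mcp_tools = [
    {
        "name": "read_file",
        "description": "Read a file from the workspace",
        "inputSchema": {
            "type": "object",
            "properties": {"path": {"type": "string"}},
            "required": ["path"],
        },
    }
]

openai_tools = mcp_tools_to_openai(mcp_tools)
print(openai_tools[0]["function"]["name"])  # read_file
```

In this sketch the conversion is nearly a field rename, which is one argument for the criterion above: routing all tools through the MCP server avoids maintaining a second, OpenAI-specific tool registry.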