Tori — AI Agent Workflow Manager
An AI-agent-driven workflow management web app: describe a requirement, the AI plans it, agents execute it, and you can give feedback at any time via comments.
Quick Start
# Development mode (starts frontend and backend together)
make dev
# Build for production
make build
# Deploy to the OCI ARM server
make deploy
Configuration
cp config.yaml.example config.yaml
# Edit config.yaml and fill in the LLM API key, etc.
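A minimal `config.yaml` sketch is shown below. The field names and layout are illustrative assumptions only; the authoritative schema is whatever `config.yaml.example` contains.

```yaml
# Illustrative sketch — see config.yaml.example for the real keys.
llm:
  base_url: "https://example-gateway/v1"  # OpenAI-compatible endpoint (placeholder)
  api_key: "sk-..."                       # your LLM API key
server:
  host: "0.0.0.0"
  port: 8080                              # assumed default, adjust as needed
```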
Tech Stack
- Backend: Rust (Axum) + SQLite
- Frontend: Vite + Vue 3 + TypeScript
- LLM: OpenAI-compatible API (Requesty.ai gateway)
- Real-time communication: WebSocket
- Remote execution: SSH
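Real-time updates reach the frontend over WebSocket. The message schema is not documented here, so the sketch below assumes a hypothetical JSON envelope with `type` and `payload` fields and a `/ws` route; adjust both to the actual backend.

```typescript
// Hypothetical envelope shape — the real schema may differ.
interface WsMessage {
  type: string;
  payload: unknown;
}

// Parse a raw WebSocket frame into a typed envelope; returns null on
// malformed JSON or a missing `type` field instead of throwing.
function parseWsMessage(raw: string): WsMessage | null {
  try {
    const data = JSON.parse(raw);
    if (typeof data === "object" && data !== null && typeof data.type === "string") {
      return data as WsMessage;
    }
    return null;
  } catch {
    return null;
  }
}

// Example wiring (assumes the backend exposes /ws; route name is a guess):
// const ws = new WebSocket("ws://localhost:8080/ws");
// ws.onmessage = (ev) => {
//   const msg = parseWsMessage(ev.data);
//   if (msg) console.log(msg.type, msg.payload);
// };
```

Parsing defensively like this keeps a single malformed frame from breaking the UI's event handler.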