Add global knowledge base with RAG search
- KB module: fastembed (AllMiniLML6V2) for CPU embedding, SQLite for vector storage with brute-force cosine similarity search
- Chunking by ## headings, embeddings stored as BLOB in kb_chunks table
- API: GET/PUT /api/kb for full-text read/write with auto re-indexing
- Agent tools: kb_search (top-5 semantic search) and kb_read (full text), available in both planning and execution phases
- Frontend: Settings menu in sidebar footer, KB editor as independent view with markdown textarea and save button
- Also: extract shared db_err/ApiResult to api/mod.rs, add context management design doc

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
parent 1aa81896b5
commit d9d3bc340c

Cargo.lock (generated, 1631 lines changed; diff suppressed because it is too large)
Cargo.toml

```diff
@@ -26,3 +26,4 @@ uuid = { version = "1", features = ["v4"] }
 anyhow = "1"
 mime_guess = "2"
 nix = { version = "0.29", features = ["signal"] }
+fastembed = "5"
```
doc/context.md (new file, 54 lines)

# Context Management: Current State and Design

## Current State

There is currently no limit on context length, so requests risk exceeding the model's token limit.

### Existing mitigations

1. **Clear on phase transition**: `step_messages` is `clear()`ed on the planning→executing and step→step transitions, so context does not accumulate across phases
2. **Per-output truncation**: bash output is capped at 8000 bytes, and overlong read_file output is truncated too
3. **Step context summaries**: completed steps keep only a summary (`step_summaries`), not their full output

### Risk scenarios

- Too many tool-call rounds within a single execution step (repeated bash, read_file): `step_messages` grows without bound
- Each round pushes the LLM's assistant message + tool result into `step_messages`, with no cap
- Eventually the whole messages array exceeds the model's context window

## Design

### Strategy: sliding window + summary of early messages

When `step_messages` exceeds a threshold, keep the most recent N rounds of conversation in full and collapse the earlier tool call/result pairs into a single summary message.

```
[system prompt]
[user: step context]
[summary of early tool interactions] ← compressed history
[recent assistant + tool messages] ← most recent N rounds, kept in full
```

### Implementation details

1. **Token estimation**: rough estimate from character count (1 token ≈ 3-4 chars for mixed Chinese/English); no exact tokenizer needed
2. **Threshold**: configurable, defaulting to something like 80000 chars (roughly 20k-25k tokens), leaving headroom for the system prompt and the response
3. **Compression trigger**: check the total length every time messages are built; compress when over the threshold
4. **Compression method**:
   - Simple: drop the early tool call/result pairs and replace them with `[已执行 N 次工具调用,最近结果见下文]`
   - Advanced: generate a summary with the LLM (one extra API call, but better quality)
5. **Never compressed**: the system prompt, the user context, and the most recent 2-3 rounds of interaction

### Where it goes

In `run_agent_loop`, after the messages are built and before the LLM is called, insert the compression step:

```rust
// inside run_agent_loop in agent.rs, around L706-L725
let (mut messages, tools) = match &state.phase { ... };

// Compact context
compact_messages(&mut messages, MAX_CONTEXT_CHARS);
```

`compact_messages`: scan from the front, keep the system/user head, compute the total length, and when over the limit replace the early assistant+tool messages with a summary.
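The sliding-window compaction described above can be sketched as a small self-contained function. `Msg` here is a hypothetical stand-in for the real `ChatMessage` type, and the role names, the `keep_recent` parameter, and the summary wording are assumptions for illustration, not the project's actual API.

```rust
// Sketch of sliding-window context compaction.
// Assumption: messages start with a system/user "head" that must be preserved.
#[derive(Clone, Debug)]
struct Msg {
    role: &'static str, // "system" | "user" | "assistant" | "tool"
    content: String,
}

/// Keep the leading system/user messages and the last `keep_recent` messages
/// in full; when the total character count exceeds `max_chars`, collapse the
/// middle into a single summary message.
fn compact_messages(messages: &mut Vec<Msg>, max_chars: usize, keep_recent: usize) {
    let total: usize = messages.iter().map(|m| m.content.len()).sum();
    if total <= max_chars {
        return;
    }
    // Head = leading system/user messages (prompt + step context)
    let head = messages
        .iter()
        .take_while(|m| m.role == "system" || m.role == "user")
        .count();
    if messages.len() <= head + keep_recent {
        return; // nothing in the middle to collapse
    }
    let tail_start = messages.len() - keep_recent;
    let dropped = tail_start - head;
    let summary = Msg {
        role: "user",
        content: format!("[{} earlier tool interactions elided]", dropped),
    };
    // Replace the middle slice with the single summary message
    messages.splice(head..tail_start, std::iter::once(summary));
}

fn main() {
    let mut msgs = vec![
        Msg { role: "system", content: "prompt".into() },
        Msg { role: "user", content: "step context".into() },
        Msg { role: "assistant", content: "x".repeat(50) },
        Msg { role: "tool", content: "y".repeat(50) },
        Msg { role: "assistant", content: "recent".into() },
        Msg { role: "tool", content: "recent result".into() },
    ];
    compact_messages(&mut msgs, 40, 2);
    println!("{} messages after compaction", msgs.len());
}
```

A character-count heuristic keeps this dependency-free; swapping in a real tokenizer would only change the `total` computation.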
doc/kb.md (new file, 42 lines)

# Knowledge Base (KB / RAG)

## Overview

A global knowledge base shared by the agents of all projects. Users edit it as markdown in the frontend; saving automatically chunks and indexes it. Agents query it via the `kb_search` and `kb_read` tools.

## Data flow

```
user edits markdown textarea
→ PUT /api/kb
→ raw text stored in SQLite (kb_content table, single row)
→ split into chunks by ## heading
→ fastembed (AllMiniLML6V2) generates embeddings
→ chunk + embedding stored in SQLite (kb_chunks table)
```

## Chunking strategy

Split on markdown `##` headings; each section becomes one chunk. Any leading content without a heading becomes its own chunk.

## Agent tools

- `kb_search(query: str)` → vector search, top-5, returns relevant snippets
- `kb_read()` → returns the full KB text

## API

- `GET /api/kb` → returns the full KB text `{ content: string }`
- `PUT /api/kb` → saves the full text and re-chunks/re-indexes `{ content: string }`

## Technology choices

- **Vector storage**: SQLite (embeddings stored as BLOB, brute-force cosine search)
- **Embedding**: fastembed-rs (AllMiniLML6V2, 384 dims, CPU)
- **Raw text storage**: SQLite (kb_content table)

## Frontend

- Settings popup menu at the bottom of the sidebar → Knowledge Base
- Clicking switches to a standalone KB editor view (a sibling of the project view)
- Large textarea + Save button
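The chunking rule above (split on `##`, leading content becomes its own chunk) can be sketched standalone. `split_sections` is a simplified illustration using plain tuples rather than the project's `Chunk` type:

```rust
/// Split markdown into (title, body) sections on "## " headings;
/// text before the first heading gets an empty title.
fn split_sections(content: &str) -> Vec<(String, String)> {
    // Push the accumulated lines as one section, skipping empty bodies
    fn flush(title: &str, lines: &[&str], out: &mut Vec<(String, String)>) {
        let body = lines.join("\n").trim().to_string();
        if !body.is_empty() {
            out.push((title.to_string(), body));
        }
    }

    let mut sections = Vec::new();
    let mut title = String::new();
    let mut lines: Vec<&str> = Vec::new();

    for line in content.lines() {
        if let Some(rest) = line.strip_prefix("## ") {
            flush(&title, &lines, &mut sections); // close the previous section
            title = rest.trim().to_string();
            lines.clear();
        } else {
            lines.push(line);
        }
    }
    flush(&title, &lines, &mut sections); // close the final section
    sections
}

fn main() {
    let doc = "intro text\n\n## Setup\nsteps here\n\n## Usage\nrun it";
    for (title, body) in split_sections(doc) {
        println!("[{}] {}", title, body);
    }
}
```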
```diff
@@ -9,10 +9,3 @@ template
 ---
 时间观察app
 ---
-
-## Verbose / simplifiable code
-
-- **agent.rs**: the `NewRequirement` and `Comment` branches share a near-identical sequence (set final_status → update DB status → broadcast WorkflowStatusUpdate → query all_steps → generate_report → update report → broadcast ReportReady); it could be extracted into a shared function (e.g. `finish_workflow_and_report`). The venv creation/check (create_dir_all + .venv exists + uv venv) is duplicated in two places and could become a helper.
-- **api/**: `db_err` and `ApiResult<T>` are defined repeatedly in `projects.rs`, `workflows.rs`, and `timers.rs`; they could move to `api/mod.rs` or a common module.
-- **WorkflowView.vue**: `handleWsMessage` repeats `workflow.value && msg.workflow_id === workflow.value.id` in several places; take `const wf = workflow.value` once and check it uniformly. In the `ReportReady` branch, `workflow.value = { ...workflow.value, status: workflow.value.status }` has no effect; delete it or replace it with a real refresh.
-- **PlanSection.vue / ExecutionSection.vue**: both have `expandedSteps` (a Set), `toggleStep`, and a status→icon/label mapping; consider extracting a composable or shared util.
```
src/agent.rs (80 lines changed)

```diff
@@ -87,10 +87,11 @@ pub struct AgentManager {
     next_port: AtomicU16,
     pool: SqlitePool,
     llm_config: LlmConfig,
+    kb: Option<Arc<crate::kb::KbManager>>,
 }
 
 impl AgentManager {
-    pub fn new(pool: SqlitePool, llm_config: LlmConfig) -> Arc<Self> {
+    pub fn new(pool: SqlitePool, llm_config: LlmConfig, kb: Option<Arc<crate::kb::KbManager>>) -> Arc<Self> {
         Arc::new(Self {
             agents: RwLock::new(HashMap::new()),
             broadcast: RwLock::new(HashMap::new()),
@@ -98,6 +99,7 @@ impl AgentManager {
             next_port: AtomicU16::new(9100),
             pool,
             llm_config,
+            kb,
         })
     }
 
@@ -146,6 +148,14 @@ impl AgentManager {
         }
     }
 }
+
+async fn ensure_venv(exec: &LocalExecutor, workdir: &str) {
+    let _ = tokio::fs::create_dir_all(workdir).await;
+    let venv_path = format!("{}/.venv", workdir);
+    if !std::path::Path::new(&venv_path).exists() {
+        let _ = exec.execute("uv venv .venv", workdir).await;
+    }
+}
 
 async fn agent_loop(
     project_id: String,
     mut rx: mpsc::Receiver<AgentEvent>,
@@ -196,11 +206,7 @@ async fn agent_loop(
         .await;
 
     // Ensure workspace and venv exist
-    let _ = tokio::fs::create_dir_all(&workdir).await;
-    let venv_path = format!("{}/.venv", workdir);
-    if !std::path::Path::new(&venv_path).exists() {
-        let _ = exec.execute("uv venv .venv", &workdir).await;
-    }
+    ensure_venv(&exec, &workdir).await;
     let _ = tokio::fs::write(format!("{}/requirement.md", workdir), &requirement).await;
 
     tracing::info!("Starting agent loop for workflow {}", workflow_id);
@@ -264,11 +270,7 @@ async fn agent_loop(
         let Some(wf) = wf else { continue };
 
         // Ensure venv exists for comment re-runs too
-        let _ = tokio::fs::create_dir_all(&workdir).await;
-        let venv_path = format!("{}/.venv", workdir);
-        if !std::path::Path::new(&venv_path).exists() {
-            let _ = exec.execute("uv venv .venv", &workdir).await;
-        }
+        ensure_venv(&exec, &workdir).await;
 
         // Clear old plan steps (keep log entries for history)
         let _ = sqlx::query("DELETE FROM plan_steps WHERE workflow_id = ? AND kind = 'plan'")
@@ -375,6 +377,23 @@ fn tool_list_files() -> Tool {
     }))
 }
 
+fn tool_kb_search() -> Tool {
+    make_tool("kb_search", "搜索知识库中与查询相关的内容片段。返回最相关的 top-5 片段。", serde_json::json!({
+        "type": "object",
+        "properties": {
+            "query": { "type": "string", "description": "搜索查询" }
+        },
+        "required": ["query"]
+    }))
+}
+
+fn tool_kb_read() -> Tool {
+    make_tool("kb_read", "读取知识库全文内容。", serde_json::json!({
+        "type": "object",
+        "properties": {}
+    }))
+}
+
 fn build_planning_tools() -> Vec<Tool> {
     vec![
         make_tool("update_plan", "设置高层执行计划。分析需求后调用此工具提交计划。每个步骤应是一个逻辑阶段(不是具体命令),包含简短标题和详细描述。调用后自动进入执行阶段。", serde_json::json!({
@@ -397,6 +416,8 @@ fn build_planning_tools() -> Vec<Tool> {
         })),
         tool_list_files(),
         tool_read_file(),
+        tool_kb_search(),
+        tool_kb_read(),
     ]
 }
 
@@ -451,6 +472,8 @@ fn build_execution_tools() -> Vec<Tool> {
             },
             "required": ["content"]
         })),
+        tool_kb_search(),
+        tool_kb_read(),
     ]
 }
 
@@ -481,6 +504,7 @@ fn build_planning_prompt(project_id: &str) -> String {
         - 因此前端 HTML 中的所有 API 请求必须使用【不带开头 / 的相对路径】\n\
         - 正确示例:fetch('todos') 或 fetch('./todos') 错误示例:fetch('/todos') 或 fetch('/api/todos')\n\
         - HTML 中的 <base> 标签不需要设置,只要不用绝对路径就行\n\
+        - 知识库工具:kb_search(query) 搜索相关片段,kb_read() 读取全文\n\
         \n\
         请使用中文回复。",
         project_id,
@@ -511,6 +535,7 @@ fn build_execution_prompt(project_id: &str) -> String {
         - 静态文件访问:/api/projects/{0}/files/{(unknown)}\n\
         - 后台服务访问:/api/projects/{0}/app/(启动命令需监听 0.0.0.0:$PORT)\n\
         - 【重要】应用通过反向代理访问,前端 HTML/JS 中的 fetch/XHR 请求必须使用相对路径(如 fetch('todos')),绝对不能用 / 开头的路径(如 fetch('/todos')),否则会 404\n\
+        - 知识库工具:kb_search(query) 搜索相关片段,kb_read() 读取全文\n\
         \n\
         请使用中文回复。",
         project_id,
@@ -951,6 +976,37 @@ async fn run_agent_loop(
                 }
             }
 
+            "kb_search" => {
+                let query = args["query"].as_str().unwrap_or("");
+                let result = if let Some(kb) = &mgr.kb {
+                    match kb.search(query).await {
+                        Ok(results) if results.is_empty() => "知识库为空或没有匹配结果。".to_string(),
+                        Ok(results) => {
+                            results.iter().enumerate().map(|(i, r)| {
+                                format!("--- 片段 {} (相似度: {:.2}) ---\n{}", i + 1, r.score, r.content)
+                            }).collect::<Vec<_>>().join("\n\n")
+                        }
+                        Err(e) => format!("Error: {}", e),
+                    }
+                } else {
+                    "知识库未初始化。".to_string()
+                };
+                state.step_messages.push(ChatMessage::tool_result(&tc.id, &result));
+            }
+
+            "kb_read" => {
+                let result: String = match sqlx::query_scalar::<_, String>("SELECT content FROM kb_content WHERE id = 1")
+                    .fetch_one(pool)
+                    .await
+                {
+                    Ok(content) => {
+                        if content.is_empty() { "知识库为空。".to_string() } else { content }
+                    }
+                    Err(e) => format!("Error: {}", e),
+                };
+                state.step_messages.push(ChatMessage::tool_result(&tc.id, &result));
+            }
+
             // IO tools: execute, read_file, write_file, list_files
             _ => {
                 let current_plan_step_id = match &state.phase {
@@ -968,6 +1024,8 @@ async fn run_agent_loop(
                     "read_file" => format!("Read: {}", args["path"].as_str().unwrap_or("?")),
                     "write_file" => format!("Write: {}", args["path"].as_str().unwrap_or("?")),
                     "list_files" => format!("List: {}", args["path"].as_str().unwrap_or(".")),
+                    "kb_search" => format!("KB Search: {}", args["query"].as_str().unwrap_or("?")),
+                    "kb_read" => "KB Read".to_string(),
                     other => other.to_string(),
                 };
 
```
src/api/kb.rs (new file, 53 lines)

```rust
use std::sync::Arc;
use axum::{
    extract::State,
    routing::get,
    Json, Router,
};
use serde::{Deserialize, Serialize};
use crate::AppState;
use super::{ApiResult, db_err};

#[derive(Serialize, Deserialize)]
pub struct KbContent {
    pub content: String,
}

pub fn router(state: Arc<AppState>) -> Router {
    Router::new()
        .route("/kb", get(get_kb).put(put_kb))
        .with_state(state)
}

async fn get_kb(
    State(state): State<Arc<AppState>>,
) -> ApiResult<KbContent> {
    let content: String = sqlx::query_scalar("SELECT content FROM kb_content WHERE id = 1")
        .fetch_one(&state.db.pool)
        .await
        .map_err(db_err)?;

    Ok(Json(KbContent { content }))
}

async fn put_kb(
    State(state): State<Arc<AppState>>,
    Json(input): Json<KbContent>,
) -> ApiResult<KbContent> {
    sqlx::query("UPDATE kb_content SET content = ?, updated_at = datetime('now') WHERE id = 1")
        .bind(&input.content)
        .execute(&state.db.pool)
        .await
        .map_err(db_err)?;

    // Re-index
    if let Some(kb) = &state.kb {
        if let Err(e) = kb.index(&input.content).await {
            tracing::error!("KB indexing failed: {}", e);
        }
    }

    Ok(Json(KbContent {
        content: input.content,
    }))
}
```
src/api/mod.rs

```diff
@@ -1,3 +1,4 @@
+mod kb;
 mod projects;
 mod timers;
 mod workflows;
@@ -9,16 +10,24 @@ use axum::{
     http::StatusCode,
     response::{IntoResponse, Response},
     routing::{get, any},
-    Router,
+    Json, Router,
 };
 
 use crate::AppState;
 
+pub(crate) type ApiResult<T> = Result<Json<T>, Response>;
+
+pub(crate) fn db_err(e: sqlx::Error) -> Response {
+    tracing::error!("Database error: {}", e);
+    (StatusCode::INTERNAL_SERVER_ERROR, e.to_string()).into_response()
+}
+
 pub fn router(state: Arc<AppState>) -> Router {
     Router::new()
         .merge(projects::router(state.clone()))
         .merge(workflows::router(state.clone()))
         .merge(timers::router(state.clone()))
+        .merge(kb::router(state.clone()))
         .route("/projects/{id}/files/{*path}", get(serve_project_file))
         .route("/projects/{id}/app/{*path}", any(proxy_to_service).with_state(state.clone()))
         .route("/projects/{id}/app/", any(proxy_to_service_root).with_state(state))
```
src/api/projects.rs

```diff
@@ -1,21 +1,13 @@
 use std::sync::Arc;
 use axum::{
     extract::{Path, State},
-    http::StatusCode,
-    response::{IntoResponse, Response},
     routing::get,
     Json, Router,
 };
 use serde::Deserialize;
 use crate::AppState;
 use crate::db::Project;
+use super::{ApiResult, db_err};
 
-type ApiResult<T> = Result<Json<T>, Response>;
-
-fn db_err(e: sqlx::Error) -> Response {
-    tracing::error!("Database error: {}", e);
-    (StatusCode::INTERNAL_SERVER_ERROR, e.to_string()).into_response()
-}
-
 #[derive(Deserialize)]
 pub struct CreateProject {
```
src/api/timers.rs

```diff
@@ -9,13 +9,7 @@ use axum::{
 use serde::Deserialize;
 use crate::AppState;
 use crate::db::Timer;
+use super::{ApiResult, db_err};
 
-type ApiResult<T> = Result<Json<T>, Response>;
-
-fn db_err(e: sqlx::Error) -> Response {
-    tracing::error!("Database error: {}", e);
-    (StatusCode::INTERNAL_SERVER_ERROR, e.to_string()).into_response()
-}
-
 #[derive(Deserialize)]
 pub struct CreateTimer {
```
src/api/workflows.rs

```diff
@@ -10,19 +10,13 @@ use serde::Deserialize;
 use crate::AppState;
 use crate::agent::AgentEvent;
 use crate::db::{Workflow, PlanStep, Comment};
+use super::{ApiResult, db_err};
 
 #[derive(serde::Serialize)]
 struct ReportResponse {
     report: String,
 }
 
-type ApiResult<T> = Result<Json<T>, Response>;
-
-fn db_err(e: sqlx::Error) -> Response {
-    tracing::error!("Database error: {}", e);
-    (StatusCode::INTERNAL_SERVER_ERROR, e.to_string()).into_response()
-}
-
 #[derive(Deserialize)]
 pub struct CreateWorkflow {
     pub requirement: String,
```
src/db.rs (29 lines changed)

```diff
@@ -101,6 +101,35 @@ impl Database {
         .execute(&self.pool)
         .await;
 
+        // KB tables
+        sqlx::query(
+            "CREATE TABLE IF NOT EXISTS kb_content (
+                id INTEGER PRIMARY KEY CHECK (id = 1),
+                content TEXT NOT NULL DEFAULT '',
+                updated_at TEXT NOT NULL DEFAULT (datetime('now'))
+            )"
+        )
+        .execute(&self.pool)
+        .await?;
+
+        // Insert default row if not exists
+        let _ = sqlx::query(
+            "INSERT OR IGNORE INTO kb_content (id, content) VALUES (1, '')"
+        )
+        .execute(&self.pool)
+        .await;
+
+        sqlx::query(
+            "CREATE TABLE IF NOT EXISTS kb_chunks (
+                id TEXT PRIMARY KEY,
+                title TEXT NOT NULL DEFAULT '',
+                content TEXT NOT NULL,
+                embedding BLOB NOT NULL
+            )"
+        )
+        .execute(&self.pool)
+        .await?;
+
         sqlx::query(
             "CREATE TABLE IF NOT EXISTS timers (
                 id TEXT PRIMARY KEY,
```
src/kb.rs (new file, 167 lines)

```rust
use anyhow::Result;
use sqlx::sqlite::SqlitePool;
use std::sync::Mutex;

const TOP_K: usize = 5;

pub struct KbManager {
    embedder: Mutex<fastembed::TextEmbedding>,
    pool: SqlitePool,
}

#[derive(Debug, Clone)]
pub struct SearchResult {
    pub title: String,
    pub content: String,
    pub score: f32,
}

/// A chunk of KB content split by heading
#[derive(Debug, Clone)]
struct Chunk {
    title: String,
    content: String,
}

impl KbManager {
    pub fn new(pool: SqlitePool) -> Result<Self> {
        let embedder = fastembed::TextEmbedding::try_new(
            fastembed::InitOptions::new(fastembed::EmbeddingModel::AllMiniLML6V2)
                .with_show_download_progress(true),
        )?;
        Ok(Self { embedder: Mutex::new(embedder), pool })
    }

    /// Re-index: chunk the content, embed, store in SQLite
    pub async fn index(&self, content: &str) -> Result<()> {
        // Clear old chunks
        sqlx::query("DELETE FROM kb_chunks")
            .execute(&self.pool)
            .await?;

        let chunks = split_chunks(content);
        if chunks.is_empty() {
            return Ok(());
        }

        let texts: Vec<String> = chunks.iter().map(|c| c.content.clone()).collect();
        let embeddings = self.embedder.lock().unwrap().embed(texts, None)?;

        for (chunk, embedding) in chunks.iter().zip(embeddings.into_iter()) {
            let vec_bytes = embedding_to_bytes(&embedding);
            sqlx::query(
                "INSERT INTO kb_chunks (id, title, content, embedding) VALUES (?, ?, ?, ?)",
            )
            .bind(uuid::Uuid::new_v4().to_string())
            .bind(&chunk.title)
            .bind(&chunk.content)
            .bind(&vec_bytes)
            .execute(&self.pool)
            .await?;
        }

        tracing::info!("KB indexed: {} chunks", chunks.len());
        Ok(())
    }

    /// Search KB by query, returns top-k results
    pub async fn search(&self, query: &str) -> Result<Vec<SearchResult>> {
        let query_embeddings = self.embedder.lock().unwrap().embed(vec![query.to_string()], None)?;
        let query_vec = query_embeddings
            .into_iter()
            .next()
            .ok_or_else(|| anyhow::anyhow!("Failed to embed query"))?;

        // Fetch all chunks with embeddings
        let rows: Vec<(String, String, Vec<u8>)> =
            sqlx::query_as("SELECT title, content, embedding FROM kb_chunks")
                .fetch_all(&self.pool)
                .await?;

        // Compute cosine similarity and rank
        let mut scored: Vec<(f32, String, String)> = rows
            .into_iter()
            .map(|(title, content, blob)| {
                let emb = bytes_to_embedding(&blob);
                let score = cosine_similarity(&query_vec, &emb);
                (score, title, content)
            })
            .collect();

        scored.sort_by(|a, b| b.0.partial_cmp(&a.0).unwrap_or(std::cmp::Ordering::Equal));
        scored.truncate(TOP_K);

        Ok(scored
            .into_iter()
            .map(|(score, title, content)| SearchResult {
                title,
                content,
                score,
            })
            .collect())
    }
}

fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b.iter()).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 {
        return 0.0;
    }
    dot / (norm_a * norm_b)
}

fn embedding_to_bytes(embedding: &[f32]) -> Vec<u8> {
    embedding.iter().flat_map(|f| f.to_le_bytes()).collect()
}

fn bytes_to_embedding(bytes: &[u8]) -> Vec<f32> {
    bytes
        .chunks_exact(4)
        .map(|chunk| f32::from_le_bytes([chunk[0], chunk[1], chunk[2], chunk[3]]))
        .collect()
}

/// Split markdown content into chunks by ## headings
fn split_chunks(content: &str) -> Vec<Chunk> {
    let mut chunks = Vec::new();
    let mut current_title = String::new();
    let mut current_lines: Vec<&str> = Vec::new();

    for line in content.lines() {
        if line.starts_with("## ") {
            // Save previous chunk
            let text = current_lines.join("\n").trim().to_string();
            if !text.is_empty() {
                chunks.push(Chunk {
                    title: current_title.clone(),
                    content: if current_title.is_empty() {
                        text
                    } else {
                        format!("## {}\n{}", current_title, text)
                    },
                });
            }
            current_title = line.trim_start_matches("## ").trim().to_string();
            current_lines.clear();
        } else {
            current_lines.push(line);
        }
    }

    // Last chunk
    let text = current_lines.join("\n").trim().to_string();
    if !text.is_empty() {
        chunks.push(Chunk {
            title: current_title.clone(),
            content: if current_title.is_empty() {
                text
            } else {
                format!("## {}\n{}", current_title, text)
            },
        });
    }

    chunks
}
```
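The pure helpers in src/kb.rs can be exercised standalone. The functions below reproduce the module's little-endian f32 BLOB encoding and cosine scoring for illustration only; the f32 → bytes → f32 round trip is lossless, which is what makes the BLOB column a safe storage format:

```rust
// Reproductions of the free functions in src/kb.rs, for standalone checking.
fn embedding_to_bytes(embedding: &[f32]) -> Vec<u8> {
    // Each f32 becomes 4 little-endian bytes
    embedding.iter().flat_map(|f| f.to_le_bytes()).collect()
}

fn bytes_to_embedding(bytes: &[u8]) -> Vec<f32> {
    bytes
        .chunks_exact(4)
        .map(|c| f32::from_le_bytes([c[0], c[1], c[2], c[3]]))
        .collect()
}

fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b.iter()).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 { 0.0 } else { dot / (norm_a * norm_b) }
}

fn main() {
    let v = vec![0.1_f32, -0.5, 2.0];
    let rt = bytes_to_embedding(&embedding_to_bytes(&v));
    assert_eq!(v, rt); // bit-exact round trip
    println!("self-similarity: {}", cosine_similarity(&v, &v));
}
```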
src/main.rs (16 lines changed)

```diff
@@ -1,6 +1,7 @@
 mod api;
 mod agent;
 mod db;
+mod kb;
 mod llm;
 mod exec;
 mod timer;
@@ -15,6 +16,7 @@ pub struct AppState {
     pub db: db::Database,
     pub config: Config,
     pub agent_mgr: Arc<agent::AgentManager>,
+    pub kb: Option<Arc<kb::KbManager>>,
 }
 
 #[derive(Debug, Clone, serde::Deserialize)]
@@ -56,9 +58,22 @@ async fn main() -> anyhow::Result<()> {
     let database = db::Database::new(&config.database.path).await?;
     database.migrate().await?;
 
+    // Initialize KB manager
+    let kb_arc = match kb::KbManager::new(database.pool.clone()) {
+        Ok(kb) => {
+            tracing::info!("KB manager initialized");
+            Some(Arc::new(kb))
+        }
+        Err(e) => {
+            tracing::warn!("KB manager init failed (will retry on use): {}", e);
+            None
+        }
+    };
+
     let agent_mgr = agent::AgentManager::new(
         database.pool.clone(),
         config.llm.clone(),
+        kb_arc.clone(),
     );
 
     timer::start_timer_runner(database.pool.clone(), agent_mgr.clone());
@@ -67,6 +82,7 @@ async fn main() -> anyhow::Result<()> {
         db: database,
         config: config.clone(),
         agent_mgr: agent_mgr.clone(),
+        kb: kb_arc,
     });
 
     let app = Router::new()
```
web/src/api.ts

```diff
@@ -75,4 +75,12 @@ export const api = {
 
   deleteTimer: (timerId: string) =>
     request<void>(`/timers/${timerId}`, { method: 'DELETE' }),
+
+  getKb: () => request<{ content: string }>('/kb'),
+
+  putKb: (content: string) =>
+    request<{ content: string }>('/kb', {
+      method: 'PUT',
+      body: JSON.stringify({ content }),
+    }),
 }
```
web/src/components/App.vue

```diff
@@ -4,6 +4,7 @@ import Sidebar from './Sidebar.vue'
 import WorkflowView from './WorkflowView.vue'
 import ReportView from './ReportView.vue'
 import CreateForm from './CreateForm.vue'
+import KbEditor from './KbEditor.vue'
 import { api } from '../api'
 import type { Project } from '../types'
 
@@ -12,6 +13,7 @@ const selectedProjectId = ref('')
 const reportWorkflowId = ref('')
 const error = ref('')
 const creating = ref(false)
+const showKb = ref(false)
 
 const isReportPage = computed(() => !!reportWorkflowId.value)
 
@@ -54,6 +56,7 @@ function onSelectProject(id: string) {
   selectedProjectId.value = id
   reportWorkflowId.value = ''
   creating.value = false
+  showKb.value = false
   history.pushState(null, '', `/projects/${id}`)
 }
 
@@ -110,10 +113,12 @@ async function onDeleteProject(id: string) {
       @select="onSelectProject"
       @create="onStartCreate"
       @delete="onDeleteProject"
+      @openKb="showKb = true; selectedProjectId = ''; creating = false"
     />
     <main class="main-content">
       <div v-if="error" class="error-banner" @click="error = ''">{{ error }}</div>
-      <div v-if="creating" class="empty-state">
+      <KbEditor v-if="showKb" />
+      <div v-else-if="creating" class="empty-state">
         <CreateForm @submit="onConfirmCreate" @cancel="creating = false" />
       </div>
       <div v-else-if="!selectedProjectId" class="empty-state">
```
130 web/src/components/KbEditor.vue Normal file
@@ -0,0 +1,130 @@
+<script setup lang="ts">
+import { ref, onMounted } from 'vue'
+import { api } from '../api'
+
+const content = ref('')
+const saving = ref(false)
+const loading = ref(true)
+const message = ref('')
+
+onMounted(async () => {
+  try {
+    const kb = await api.getKb()
+    content.value = kb.content
+  } catch (e: any) {
+    message.value = 'Failed to load: ' + e.message
+  } finally {
+    loading.value = false
+  }
+})
+
+async function save() {
+  saving.value = true
+  message.value = ''
+  try {
+    await api.putKb(content.value)
+    message.value = 'Saved & indexed'
+    setTimeout(() => { message.value = '' }, 2000)
+  } catch (e: any) {
+    message.value = 'Error: ' + e.message
+  } finally {
+    saving.value = false
+  }
+}
+</script>
+
+<template>
+  <div class="kb-view">
+    <div class="kb-header">
+      <h2>Knowledge Base</h2>
+      <div class="kb-actions">
+        <span v-if="message" class="kb-message">{{ message }}</span>
+        <button class="btn-save" @click="save" :disabled="saving">
+          {{ saving ? 'Saving...' : 'Save' }}
+        </button>
+      </div>
+    </div>
+    <div v-if="loading" class="kb-loading">Loading...</div>
+    <textarea
+      v-else
+      v-model="content"
+      class="kb-textarea"
+      placeholder="Write your knowledge base in Markdown... Use ## headings to split into searchable chunks."
+      spellcheck="false"
+    />
+  </div>
+</template>
+
+<style scoped>
+.kb-view {
+  flex: 1;
+  display: flex;
+  flex-direction: column;
+  overflow: hidden;
+}
+
+.kb-header {
+  display: flex;
+  align-items: center;
+  justify-content: space-between;
+  padding: 16px 20px;
+  border-bottom: 1px solid var(--border);
+}
+
+.kb-header h2 {
+  font-size: 16px;
+  font-weight: 600;
+  margin: 0;
+}
+
+.kb-actions {
+  display: flex;
+  align-items: center;
+  gap: 12px;
+}
+
+.kb-message {
+  font-size: 12px;
+  color: var(--accent);
+}
+
+.btn-save {
+  padding: 6px 16px;
+  background: var(--accent);
+  color: var(--bg-primary);
+  border: none;
+  border-radius: 6px;
+  font-size: 13px;
+  cursor: pointer;
+}
+
+.btn-save:disabled {
+  opacity: 0.5;
+}
+
+.kb-loading {
+  flex: 1;
+  display: flex;
+  align-items: center;
+  justify-content: center;
+  color: var(--text-secondary);
+}
+
+.kb-textarea {
+  flex: 1;
+  padding: 20px;
+  background: var(--bg-primary);
+  color: var(--text-primary);
+  border: none;
+  resize: none;
+  font-family: 'JetBrains Mono', monospace;
+  font-size: 14px;
+  line-height: 1.6;
+  outline: none;
+}
+
+.kb-textarea::placeholder {
+  color: var(--text-secondary);
+  opacity: 0.5;
+}
+</style>
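The textarea placeholder above mirrors the KB module's chunking rule: the document is split into one chunk per `## ` heading before embedding. The real chunker lives in the Rust KB module; the TS sketch below only illustrates the rule, and keeping any pre-heading preamble as its own chunk is an assumption.

```typescript
// Illustrative chunker: split a markdown document at each "## " heading.
// Text before the first heading (if any) becomes its own leading chunk.
function chunkByHeadings(markdown: string): string[] {
  const chunks: string[] = [];
  let current: string[] = [];
  for (const line of markdown.split('\n')) {
    // A new "## " heading closes the chunk accumulated so far.
    if (line.startsWith('## ') && current.length > 0) {
      chunks.push(current.join('\n').trim());
      current = [];
    }
    current.push(line);
  }
  if (current.length > 0) chunks.push(current.join('\n').trim());
  return chunks.filter((c) => c.length > 0);
}
```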
@@ -1,4 +1,5 @@
 <script setup lang="ts">
+import { ref } from 'vue'
 import type { Project } from '../types'
 
 defineProps<{
@@ -10,14 +11,22 @@ const emit = defineEmits<{
   select: [id: string]
   create: []
   delete: [id: string]
+  openKb: []
 }>()
 
+const showSettings = ref(false)
+
 function onDelete(e: Event, id: string) {
   e.stopPropagation()
   if (confirm('确定删除这个项目?')) {
     emit('delete', id)
   }
 }
+
+function onOpenKb() {
+  showSettings.value = false
+  emit('openKb')
+}
 </script>
 
 <template>
@@ -41,6 +50,14 @@ function onDelete(e: Event, id: string) {
         <span class="project-time">{{ new Date(project.updated_at).toLocaleDateString() }}</span>
       </div>
     </nav>
+    <div class="sidebar-footer">
+      <div class="settings-wrapper">
+        <button class="btn-settings" @click="showSettings = !showSettings">Settings</button>
+        <div v-if="showSettings" class="settings-menu">
+          <button class="settings-item" @click="onOpenKb">Knowledge Base</button>
+        </div>
+      </div>
+    </div>
   </aside>
 </template>
 
@@ -153,4 +170,59 @@ function onDelete(e: Event, id: string) {
   font-size: 11px;
   color: var(--text-secondary);
 }
+
+.sidebar-footer {
+  padding: 12px 16px;
+  border-top: 1px solid var(--border);
+}
+
+.settings-wrapper {
+  position: relative;
+}
+
+.btn-settings {
+  width: 100%;
+  padding: 8px;
+  background: transparent;
+  color: var(--text-secondary);
+  border: none;
+  font-size: 13px;
+  cursor: pointer;
+  text-align: left;
+  border-radius: 6px;
+}
+
+.btn-settings:hover {
+  background: var(--bg-tertiary);
+  color: var(--text-primary);
+}
+
+.settings-menu {
+  position: absolute;
+  bottom: 100%;
+  left: 0;
+  right: 0;
+  margin-bottom: 4px;
+  background: var(--bg-primary);
+  border: 1px solid var(--border);
+  border-radius: 8px;
+  padding: 4px;
+  box-shadow: 0 4px 12px rgba(0, 0, 0, 0.2);
+}
+
+.settings-item {
+  width: 100%;
+  padding: 8px 12px;
+  background: none;
+  border: none;
+  color: var(--text-primary);
+  font-size: 13px;
+  text-align: left;
+  cursor: pointer;
+  border-radius: 6px;
+}
+
+.settings-item:hover {
+  background: var(--bg-tertiary);
+}
 </style>
@@ -82,7 +82,7 @@ function handleWsMessage(msg: WsMessage) {
       break
     case 'ReportReady':
       if (workflow.value && msg.workflow_id === workflow.value.id) {
-        workflow.value = { ...workflow.value, status: workflow.value.status }
+        loadData()
       }
       break
     case 'ProjectUpdate':