fix(intelligence): make dead_code annotations precise and implement LLM context compaction

- Changed the dead_code comments in the intelligence/llm/memory/browser
  modules from the vague "reserved for future" wording to an explicit
  explanation of Tauri's invoke_handler runtime registration mechanism
- Added #[allow(dead_code)] to the 3 genuinely unused methods in identity.rs
- Implemented the compactor use_llm: true feature: added a compact_with_llm
  method and a compactor_compact_llm Tauri command to support LLM-driven
  conversation summary generation
- Replaced 40+ println!/eprintln! debug outputs in pipeline_commands.rs with
  tracing::debug!/warn!/error! structured logging
- Removed the unnecessary #[allow(unused_imports)] from intelligence/mod.rs
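The compactor change in the third bullet can be sketched roughly as below. This is a minimal stand-in, not the repository's code: the `Summarizer` trait, `MockLlm`, and the message layout are all assumptions, and the real `compact_with_llm` presumably calls the actual LLM client asynchronously.

```rust
// Hedged sketch of LLM-driven compaction: summarize older conversation
// turns through an LLM and keep the most recent turns verbatim.
// `Summarizer` and `MockLlm` are illustrative stand-ins.

trait Summarizer {
    fn summarize(&self, text: &str) -> Result<String, String>;
}

// Test double standing in for a real LLM client.
struct MockLlm;

impl Summarizer for MockLlm {
    fn summarize(&self, text: &str) -> Result<String, String> {
        Ok(format!("[summary of {} chars]", text.len()))
    }
}

/// Compact a conversation: summarize everything except the last
/// `keep_recent` messages, which are preserved verbatim.
fn compact_with_llm(
    llm: &dyn Summarizer,
    messages: &[String],
    keep_recent: usize,
) -> Result<Vec<String>, String> {
    if messages.len() <= keep_recent {
        return Ok(messages.to_vec());
    }
    let split = messages.len() - keep_recent;
    let older = messages[..split].join("\n");
    let summary = llm.summarize(&older)?;
    let mut out = vec![summary];
    out.extend_from_slice(&messages[split..]);
    Ok(out)
}
```

A design note: taking the summarizer as a trait object keeps the compaction logic testable without network access, which is roughly what a `compactor_compact_llm` Tauri command would wrap.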
Author: iven
Date: 2026-03-27 00:43:14 +08:00
parent c3996573aa
commit 9a77fd4645
14 changed files with 433 additions and 265 deletions


@@ -1,9 +1,11 @@
 //! LLM Client Module
 //!
-//! Provides LLM API integration for memory extraction.
+//! Provides LLM API integration for memory extraction and embedding.
 //! Supports multiple providers with a unified interface.
 //!
-//! Note: Some fields are reserved for future streaming and provider selection features
+//! NOTE: #[tauri::command] functions are registered via invoke_handler! at runtime,
+//! which the Rust compiler does not track as a "use". A module-level allow is
+//! required for Tauri-commanded functions and internal type definitions.
 #![allow(dead_code)]
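The registration mechanism this comment describes can be modeled in plain Rust. The sketch below is an analogy under stated assumptions, not Tauri's actual machinery: the real `tauri::generate_handler!` expands into a far more involved dispatcher, and the signature given to `compactor_compact_llm` here is invented. The point it illustrates is the commit's rationale — commands end up reachable only through macro-built runtime dispatch, hence the module-level `#![allow(dead_code)]`.

```rust
use std::collections::HashMap;

// Stand-in for a #[tauri::command] function; the name matches the command
// added by this commit, but this signature is purely illustrative.
fn compactor_compact_llm(input: &str) -> String {
    format!("summary of: {input}")
}

/// Build a name -> handler table, loosely modeling what a macro like
/// tauri::generate_handler! wires up for runtime invocation by name.
fn build_invoke_table() -> HashMap<&'static str, fn(&str) -> String> {
    let mut table: HashMap<&'static str, fn(&str) -> String> = HashMap::new();
    // The function is referenced only here, inside glue code the command
    // author never writes by hand - which is why such modules carry a
    // blanket #[allow(dead_code)] rather than per-item annotations.
    table.insert(
        "compactor_compact_llm",
        compactor_compact_llm as fn(&str) -> String,
    );
    table
}
```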
@@ -357,6 +359,11 @@ impl EmbeddingClient {
         }
     }
+
+    /// Check if the embedding client is properly configured and available.
+    pub fn is_configured(&self) -> bool {
+        self.config.provider != "local" && !self.config.api_key.is_empty()
+    }
 
     pub async fn embed(&self, text: &str) -> Result<EmbeddingResponse, String> {
         if self.config.provider == "local" || self.config.api_key.is_empty() {
             return Err("Local TF-IDF mode does not support API embedding".to_string());
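The guard added in this hunk is easy to exercise in isolation. In the sketch below, `EmbeddingConfig` is an assumed, trimmed-down shape carrying only the two fields the check reads, not the crate's actual config struct:

```rust
// Stand-in mirroring the is_configured() logic from the diff above;
// `EmbeddingConfig` is a hypothetical shape, not the real type.
struct EmbeddingConfig {
    provider: String,
    api_key: String,
}

impl EmbeddingConfig {
    /// API embedding is available only for non-local providers
    /// that have a non-empty API key.
    fn is_configured(&self) -> bool {
        self.provider != "local" && !self.api_key.is_empty()
    }
}
```

Callers would check `is_configured()` before `embed()` to avoid hitting the "Local TF-IDF mode" error path at all.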