# ZCLAW Kernel Integration
> **Category**: Tauri Backend
> **Priority**: P0 - Critical
> **Maturity**: L4 - Production
> **Last Updated**: 2026-03-22
---
## 1. Feature Overview
### 1.1 Basic Information
The ZCLAW Kernel integration module is the core of the Tauri backend. It integrates the embedded ZCLAW Kernel into the desktop app, covering agent lifecycle management, message handling, and model configuration.
| Attribute | Value |
|-----------|-------|
| Category | Tauri Backend |
| Priority | P0 |
| Maturity | L4 |
| Dependencies | Tauri Runtime, zclaw-kernel crate |
### 1.2 Related Files
| File | Path | Purpose |
|------|------|---------|
| Kernel commands | `desktop/src-tauri/src/kernel_commands.rs` | Tauri command wrappers |
| Kernel state | `desktop/src-tauri/src/lib.rs` | Kernel initialization |
| Kernel config | `crates/zclaw-kernel/src/config.rs` | Configuration struct definitions |
| Kernel implementation | `crates/zclaw-kernel/src/lib.rs` | Core kernel implementation |
---
## 2. Architecture
### 2.1 Background
**User pain points**:
1. External-process startup failures and version-compatibility problems were frequent
2. Configuration files were scattered and hard to manage
3. Distribution was complex, requiring an extra runtime to be installed
**Solution**:
- Integrate the ZCLAW Kernel directly into the Tauri application
- Configure models through the UI, with no config files to edit
- Ship a single installer that works out of the box
### 2.2 Architecture Overview
```
┌──────────────────────────────────────────────────────────────────┐
│                        Tauri Desktop App                         │
├──────────────────────────────────────────────────────────────────┤
│                                                                  │
│  ┌────────────────────────────────────────────────────────────┐  │
│  │               Frontend (React + TypeScript)                │  │
│  │                                                            │  │
│  │  ┌─────────────┐   ┌─────────────┐   ┌─────────────┐       │  │
│  │  │  ModelsAPI  │   │  ChatStore  │   │ Connection  │       │  │
│  │  │ (UI config) │   │ (messages)  │   │    Store    │       │  │
│  │  └──────┬──────┘   └──────┬──────┘   └──────┬──────┘       │  │
│  │         │                 │                 │              │  │
│  │         └─────────────────┼─────────────────┘              │  │
│  │                           ▼                                │  │
│  │                ┌─────────────────────┐                     │  │
│  │                │    KernelClient     │                     │  │
│  │                │   (Tauri invoke)    │                     │  │
│  │                └──────────┬──────────┘                     │  │
│  └───────────────────────────┼────────────────────────────────┘  │
│                              │                                   │
│                              │  Tauri Commands                   │
│                              │                                   │
│  ┌───────────────────────────┼────────────────────────────────┐  │
│  │ Backend (Rust)            │                                │  │
│  │                           ▼                                │  │
│  │  ┌──────────────────────────────────────────────────┐      │  │
│  │  │                kernel_commands.rs                │      │  │
│  │  │  ├─ kernel_init                                  │      │  │
│  │  │  ├─ kernel_status                                │      │  │
│  │  │  ├─ kernel_shutdown                              │      │  │
│  │  │  ├─ agent_create                                 │      │  │
│  │  │  ├─ agent_list                                   │      │  │
│  │  │  ├─ agent_get                                    │      │  │
│  │  │  ├─ agent_delete                                 │      │  │
│  │  │  └─ agent_chat                                   │      │  │
│  │  └────────────────────────┬─────────────────────────┘      │  │
│  │                           │                                │  │
│  │                           ▼                                │  │
│  │  ┌──────────────────────────────────────────────────┐      │  │
│  │  │                zclaw-kernel crate                │      │  │
│  │  │  ├─ Kernel::boot()                               │      │  │
│  │  │  ├─ spawn_agent()                                │      │  │
│  │  │  ├─ kill_agent()                                 │      │  │
│  │  │  ├─ list_agents()                                │      │  │
│  │  │  └─ send_message()                               │      │  │
│  │  └────────────────────────┬─────────────────────────┘      │  │
│  │                           │                                │  │
│  │                           ▼                                │  │
│  │  ┌──────────────────────────────────────────────────┐      │  │
│  │  │               zclaw-runtime crate                │      │  │
│  │  │  ├─ AnthropicDriver                              │      │  │
│  │  │  ├─ OpenAiDriver                                 │      │  │
│  │  │  ├─ GeminiDriver                                 │      │  │
│  │  │  └─ LocalDriver                                  │      │  │
│  │  └──────────────────────────────────────────────────┘      │  │
│  └────────────────────────────────────────────────────────────┘  │
│                                                                  │
└──────────────────────────────────────────────────────────────────┘
```
### 2.3 Crate Dependencies
```
zclaw-types
zclaw-memory
zclaw-runtime
zclaw-kernel
desktop/src-tauri
```
---
## 3. Tauri Commands
### 3.1 Kernel Commands
```rust
/// Initialize the embedded ZCLAW Kernel
#[tauri::command]
pub async fn kernel_init(
    state: State<'_, KernelState>,
    config_request: Option<KernelConfigRequest>,
) -> Result<KernelStatusResponse, String>

/// Get the kernel status
#[tauri::command]
pub async fn kernel_status(
    state: State<'_, KernelState>,
) -> Result<KernelStatusResponse, String>

/// Shut down the kernel
#[tauri::command]
pub async fn kernel_shutdown(
    state: State<'_, KernelState>,
) -> Result<(), String>
```
### 3.2 Agent Commands
```rust
/// Create an agent
#[tauri::command]
pub async fn agent_create(
    state: State<'_, KernelState>,
    request: CreateAgentRequest,
) -> Result<CreateAgentResponse, String>

/// List all agents
#[tauri::command]
pub async fn agent_list(
    state: State<'_, KernelState>,
) -> Result<Vec<AgentInfo>, String>

/// Get agent details
#[tauri::command]
pub async fn agent_get(
    state: State<'_, KernelState>,
    agent_id: String,
) -> Result<Option<AgentInfo>, String>

/// Delete an agent
#[tauri::command]
pub async fn agent_delete(
    state: State<'_, KernelState>,
    agent_id: String,
) -> Result<(), String>

/// Send a message to an agent
#[tauri::command]
pub async fn agent_chat(
    state: State<'_, KernelState>,
    request: ChatRequest,
) -> Result<ChatResponse, String>
```
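Every command above returns `Result<T, String>`: typed internal errors are flattened into plain strings before crossing the Tauri IPC boundary, since the frontend only needs a displayable message. A minimal std-only sketch of that convention (the `KernelError` enum and `find_agent` helper here are hypothetical stand-ins, not the crate's actual types):

```rust
use std::fmt;

// Hypothetical internal error type standing in for the kernel's real errors.
#[derive(Debug)]
enum KernelError {
    NotInitialized,
    AgentNotFound(String),
}

impl fmt::Display for KernelError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            KernelError::NotInitialized => write!(f, "kernel not initialized"),
            KernelError::AgentNotFound(id) => write!(f, "agent not found: {}", id),
        }
    }
}

// Internal operation that fails with a typed error.
fn find_agent(id: &str) -> Result<String, KernelError> {
    if id == "agent-1" {
        Ok("Default Agent".to_string())
    } else {
        Err(KernelError::AgentNotFound(id.to_string()))
    }
}

// Command-style wrapper: map the typed error to `String`,
// matching the `Result<_, String>` signatures above.
fn agent_get_like(id: &str) -> Result<String, String> {
    find_agent(id).map_err(|e| e.to_string())
}

fn main() {
    assert_eq!(agent_get_like("agent-1").unwrap(), "Default Agent");
    assert!(agent_get_like("agent-2").unwrap_err().contains("agent-2"));
}
```

The same `map_err(|e| e.to_string())` shape appears in `kernel_init` below, where `Kernel::boot` errors are formatted into the command's `String` error.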
### 3.3 Data Structures
```rust
/// Kernel configuration request
pub struct KernelConfigRequest {
    pub provider: String, // kimi | qwen | deepseek | zhipu | openai | anthropic | local
    pub model: String,    // model ID
    pub api_key: Option<String>,
    pub base_url: Option<String>,
}

/// Kernel status response
pub struct KernelStatusResponse {
    pub initialized: bool,
    pub agent_count: usize,
    pub database_url: Option<String>,
    pub default_provider: Option<String>,
    pub default_model: Option<String>,
}

/// Agent creation request
pub struct CreateAgentRequest {
    pub name: String,
    pub description: Option<String>,
    pub system_prompt: Option<String>,
    pub provider: String,
    pub model: String,
    pub max_tokens: u32,
    pub temperature: f32,
}

/// Agent creation response
pub struct CreateAgentResponse {
    pub id: String,
    pub name: String,
    pub state: String,
}

/// Chat request
pub struct ChatRequest {
    pub agent_id: String,
    pub message: String,
}

/// Chat response
pub struct ChatResponse {
    pub content: String,
    pub input_tokens: u32,
    pub output_tokens: u32,
}
```
---
## 4. Kernel Initialization
### 4.1 Initialization Flow
```rust
// desktop/src-tauri/src/kernel_commands.rs
pub async fn kernel_init(
    state: State<'_, KernelState>,
    config_request: Option<KernelConfigRequest>,
) -> Result<KernelStatusResponse, String> {
    let mut kernel_lock = state.lock().await;

    // If already initialized, return the current status
    if kernel_lock.is_some() {
        let kernel = kernel_lock.as_ref().unwrap();
        return Ok(KernelStatusResponse { ... });
    }

    // Build the configuration
    let mut config = zclaw_kernel::config::KernelConfig::default();
    if let Some(req) = &config_request {
        config.default_provider = req.provider.clone();
        config.default_model = req.model.clone();

        // Set the API key and base URL for the selected provider
        match req.provider.as_str() {
            "kimi" => {
                if let Some(key) = &req.api_key {
                    config.kimi_api_key = Some(key.clone());
                }
                if let Some(url) = &req.base_url {
                    config.kimi_base_url = url.clone();
                }
            }
            "qwen" => {
                if let Some(key) = &req.api_key {
                    config.qwen_api_key = Some(key.clone());
                }
                if let Some(url) = &req.base_url {
                    config.qwen_base_url = url.clone();
                }
            }
            // ... other providers
            _ => {}
        }
    }

    // Boot the kernel
    let kernel = Kernel::boot(config.clone())
        .await
        .map_err(|e| format!("Failed to initialize kernel: {}", e))?;
    *kernel_lock = Some(kernel);

    Ok(KernelStatusResponse {
        initialized: true,
        agent_count: 0,
        database_url: Some(config.database_url),
        default_provider: Some(config.default_provider),
        default_model: Some(config.default_model),
    })
}
```
### 4.2 Kernel State Management
```rust
// Kernel state wrapper
pub type KernelState = Arc<Mutex<Option<Kernel>>>;

// Create the kernel state
pub fn create_kernel_state() -> KernelState {
    Arc::new(Mutex::new(None))
}
```
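The `Arc<Mutex<Option<Kernel>>>` wrapper is a lazy-initialization pattern: the slot starts as `None` and is filled exactly once by `kernel_init`, while every command can cheaply clone the `Arc` handle. A minimal sketch of the same pattern using `std::sync::Mutex` (the real code locks tokio's async mutex with `.await`, and the `Kernel` struct and `init_if_needed` helper here are stand-ins for illustration):

```rust
use std::sync::{Arc, Mutex};

// Stand-in for the real `Kernel`; illustration only.
struct Kernel {
    default_model: String,
}

type KernelState = Arc<Mutex<Option<Kernel>>>;

fn create_kernel_state() -> KernelState {
    Arc::new(Mutex::new(None))
}

// Initialize once; later calls see the existing instance and do nothing.
fn init_if_needed(state: &KernelState, model: &str) -> bool {
    let mut guard = state.lock().unwrap();
    if guard.is_some() {
        return false; // already initialized
    }
    *guard = Some(Kernel {
        default_model: model.to_string(),
    });
    true
}

fn main() {
    let state = create_kernel_state();
    assert!(init_if_needed(&state, "model-a"));  // first call boots
    assert!(!init_if_needed(&state, "model-b")); // second call is a no-op
    let guard = state.lock().unwrap();
    assert_eq!(guard.as_ref().unwrap().default_model, "model-a");
}
```

Holding the whole kernel behind one mutex keeps the design simple; the trade-off is that long-running operations serialize on that lock.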
### 4.3 Registration in lib.rs
```rust
// desktop/src-tauri/src/lib.rs
#[cfg_attr(mobile, tauri::mobile_entry_point)]
pub fn run() {
    tauri::Builder::default()
        .setup(|app| {
            // Register the kernel state
            app.manage(kernel_commands::create_kernel_state());
            Ok(())
        })
        .invoke_handler(tauri::generate_handler![
            // Kernel commands
            kernel_commands::kernel_init,
            kernel_commands::kernel_status,
            kernel_commands::kernel_shutdown,
            // Agent commands
            kernel_commands::agent_create,
            kernel_commands::agent_list,
            kernel_commands::agent_get,
            kernel_commands::agent_delete,
            kernel_commands::agent_chat,
        ])
        .run(tauri::generate_context!())
        .expect("error while running tauri application")
}
```
---
## 5. LLM Provider Support
### 5.1 Supported Providers
| Provider | Implementation | Base URL |
|----------|----------------|----------|
| kimi | OpenAiDriver | `https://api.kimi.com/coding/v1` |
| qwen | OpenAiDriver | `https://dashscope.aliyuncs.com/compatible-mode/v1` |
| deepseek | OpenAiDriver | `https://api.deepseek.com/v1` |
| zhipu | OpenAiDriver | `https://open.bigmodel.cn/api/paas/v4` |
| openai | OpenAiDriver | `https://api.openai.com/v1` |
| anthropic | AnthropicDriver | `https://api.anthropic.com` |
| gemini | GeminiDriver | `https://generativelanguage.googleapis.com` |
| local | LocalDriver | `http://localhost:11434/v1` |
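Each provider name maps to a fixed default base URL; the UI only needs to supply an override when the user points at a proxy or a self-hosted endpoint. A sketch of that mapping as a lookup function mirroring the table above (the function name `default_base_url` is illustrative, not the crate's actual API):

```rust
/// Default base URL per provider, mirroring the table above.
/// Illustrative helper; not part of the zclaw-kernel API.
fn default_base_url(provider: &str) -> Option<&'static str> {
    match provider {
        "kimi" => Some("https://api.kimi.com/coding/v1"),
        "qwen" => Some("https://dashscope.aliyuncs.com/compatible-mode/v1"),
        "deepseek" => Some("https://api.deepseek.com/v1"),
        "zhipu" => Some("https://open.bigmodel.cn/api/paas/v4"),
        "openai" => Some("https://api.openai.com/v1"),
        "anthropic" => Some("https://api.anthropic.com"),
        "gemini" => Some("https://generativelanguage.googleapis.com"),
        "local" => Some("http://localhost:11434/v1"),
        _ => None, // unknown providers are rejected, as in create_driver()
    }
}

fn main() {
    assert_eq!(
        default_base_url("deepseek"),
        Some("https://api.deepseek.com/v1")
    );
    assert_eq!(default_base_url("unknown"), None);
}
```

Note that most providers reuse `OpenAiDriver` with a different base URL, since they expose OpenAI-compatible APIs; only Anthropic and Gemini need dedicated drivers.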
### 5.2 Driver Creation
```rust
// crates/zclaw-kernel/src/config.rs
impl KernelConfig {
    pub fn create_driver(&self) -> Result<Arc<dyn LlmDriver>> {
        let driver: Arc<dyn LlmDriver> = match self.default_provider.as_str() {
            "kimi" => {
                let key = self.kimi_api_key.clone()
                    .ok_or_else(|| ZclawError::ConfigError("KIMI_API_KEY not set".into()))?;
                Arc::new(OpenAiDriver::with_base_url(
                    SecretString::new(key),
                    self.kimi_base_url.clone(),
                ))
            }
            "qwen" => {
                let key = self.qwen_api_key.clone()
                    .ok_or_else(|| ZclawError::ConfigError("QWEN_API_KEY not set".into()))?;
                Arc::new(OpenAiDriver::with_base_url(
                    SecretString::new(key),
                    self.qwen_base_url.clone(),
                ))
            }
            // ... other providers
            _ => return Err(ZclawError::ConfigError(
                format!("Unknown provider: {}", self.default_provider)
            )),
        };
        Ok(driver)
    }
}
```
---
## 6. Frontend Integration
### 6.1 KernelClient
```typescript
// desktop/src/lib/kernel-client.ts
export class KernelClient {
  private config: KernelConfig = {};

  setConfig(config: KernelConfig): void {
    this.config = config;
  }

  async connect(): Promise<void> {
    // Validate the configuration
    if (!this.config.provider || !this.config.model || !this.config.apiKey) {
      throw new Error('Configure a model in the "Models & API" settings page first');
    }

    // Initialize the kernel
    const status = await invoke<KernelStatus>('kernel_init', {
      configRequest: {
        provider: this.config.provider,
        model: this.config.model,
        apiKey: this.config.apiKey,
        baseUrl: this.config.baseUrl || null,
      },
    });

    // Create a default agent if none exists
    const agents = await this.listAgents();
    if (agents.length === 0) {
      const agent = await this.createAgent({
        name: 'Default Agent',
        provider: this.config.provider,
        model: this.config.model,
      });
      this.defaultAgentId = agent.id;
    }
  }

  async chat(message: string, opts?: ChatOptions): Promise<ChatResponse> {
    return invoke<ChatResponse>('agent_chat', {
      request: {
        agentId: opts?.agentId || this.defaultAgentId,
        message,
      },
    });
  }
}
```
### 6.2 ConnectionStore Integration
```typescript
// desktop/src/store/connectionStore.ts
connect: async (url?: string, token?: string) => {
  const useInternalKernel = isTauriRuntime();
  if (useInternalKernel) {
    const kernelClient = getKernelClient();
    const modelConfig = getDefaultModelConfig();
    if (!modelConfig) {
      throw new Error('Add a custom model configuration in the "Models & API" settings page first');
    }
    kernelClient.setConfig({
      provider: modelConfig.provider,
      model: modelConfig.model,
      apiKey: modelConfig.apiKey,
      baseUrl: modelConfig.baseUrl,
    });
    await kernelClient.connect();
    set({ client: kernelClient, gatewayVersion: '0.2.0-internal' });
    return;
  }
  // Non-Tauri environment: use the external gateway
  // ...
}
```
---
## 7. Comparison with the Old Architecture
| Aspect | Old (external OpenFang) | New (embedded Kernel) |
|--------|-------------------------|-----------------------|
| Backend process | Standalone OpenFang process | Built-in zclaw-kernel |
| Communication | WebSocket/HTTP | Tauri commands |
| Model configuration | TOML files | UI settings page |
| Startup | Depends on an external process | Instant |
| Installer | Requires an extra runtime | Single package |
| Process management | Requires the openfang command | Automatic |
---
## 8. Current Status
### 8.1 Implemented Features
- [x] Embedded kernel integration
- [x] Multiple LLM provider support
- [x] UI-based model configuration
- [x] Agent lifecycle management
- [x] Message sending and responses
- [x] Connection state management
- [x] Error handling
### 8.2 Test Coverage
- **Unit tests**: built-in Rust tests
- **Integration tests**: E2E test coverage
- **Coverage**: ~85%
---
## 9. Roadmap
### 9.1 Short Term (1-2 weeks)
- [ ] Add true streaming response support
### 9.2 Mid Term (1-2 months)
- [ ] Persistent agent storage
- [ ] Session history management
### 9.3 Long-Term Vision
- [ ] Concurrent multi-agent support
- [ ] Inter-agent communication
- [ ] Workflow engine integration
---
**Last Updated**: 2026-03-22