feat(chat): LLM-generated dynamic chat suggestions, replacing hardcoded keyword matching

After the AI reply finishes, the recent conversation is sent to the LLM to
generate 3 context-relevant follow-up questions, replacing the old generic
defaults such as "continue the analysis in more depth".

Changes:
- llm-service.ts: add a suggestions prompt template + llmSuggest() helper
- streamStore.ts: request suggestions via the SaaS relay; read the body in
  one shot with response.text() to avoid ReadableStream compatibility issues
  in Tauri's WebView2; fall back to keyword matching on failure
- chatStore.ts: mirror the suggestionsLoading state
- SuggestionChips.tsx: loading skeleton animation
- ChatArea.tsx: pass the loading prop through
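A minimal sketch of the parsing step behind llmSuggest(): the function name, signature, and fallback handling here are illustrative assumptions, not the committed code, but they show the intended shape: normalize the model's numbered or bulleted output into at most 3 suggestion strings, and fall back to the old keyword defaults when nothing usable comes back.

```typescript
// Hypothetical helper mirroring the llmSuggest() idea: strip list markers
// like "1. ", "2) ", "- ", "* " from each line of the raw completion, keep
// non-empty lines, cap at 3, and degrade to the keyword-based defaults.
export function parseSuggestions(raw: string, fallback: string[]): string[] {
  const lines = raw
    .split("\n")
    .map((line) => line.replace(/^\s*(?:\d+[.)]\s*|[-*]\s*)/, "").trim())
    .filter((line) => line.length > 0);
  return lines.length > 0 ? lines.slice(0, 3) : fallback;
}
```

Keeping the parser pure makes the failure path trivial: any exception or empty result upstream just resolves to the fallback array.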
iven
2026-04-23 11:41:50 +08:00
parent 3e78dacef3
commit b56d1a4c34
5 changed files with 113 additions and 34 deletions


@@ -79,6 +79,7 @@ interface ChatState {
totalOutputTokens: number;
chatMode: ChatModeType;
suggestions: string[];
+suggestionsLoading: boolean;
addMessage: (message: Message) => void;
updateMessage: (id: string, updates: Partial<Message>) => void;
@@ -111,6 +112,7 @@ export const useChatStore = create<ChatState>()(
isLoading: false,
chatMode: 'thinking' as ChatModeType,
suggestions: [],
+suggestionsLoading: false,
totalInputTokens: 0,
totalOutputTokens: 0,
@@ -367,6 +369,7 @@ const unsubStream = useStreamStore.subscribe((state) => {
if (chat.isLoading !== state.isLoading) updates.isLoading = state.isLoading;
if (chat.chatMode !== state.chatMode) updates.chatMode = state.chatMode;
if (chat.suggestions !== state.suggestions) updates.suggestions = state.suggestions;
+if (chat.suggestionsLoading !== state.suggestionsLoading) updates.suggestionsLoading = state.suggestionsLoading;
if (Object.keys(updates).length > 0) {
useChatStore.setState(updates);
}
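The streamStore side of the change could look roughly like the sketch below. The endpoint, payload shape, and function name are assumptions; the key point from the commit message is real: read the response body once with response.text() instead of consuming a ReadableStream (which can stall under Tauri's WebView2), and degrade to the keyword defaults on any failure.

```typescript
// Hypothetical fetch helper for the suggestions request. "url" and the
// JSON payload are placeholders for the SaaS relay call; the fallback
// array stands in for the old keyword-matched suggestions.
export async function fetchSuggestions(
  url: string,
  payload: unknown,
  fallback: string[],
): Promise<string[]> {
  try {
    const res = await fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(payload),
    });
    if (!res.ok) return fallback;
    // One-shot read: avoids res.body.getReader(), which is the
    // ReadableStream path that misbehaves in WebView2.
    const text = await res.text();
    const lines = text
      .split("\n")
      .map((line) => line.trim())
      .filter((line) => line.length > 0);
    return lines.length > 0 ? lines.slice(0, 3) : fallback;
  } catch {
    // Network error, relay down, etc.: degrade to keyword suggestions.
    return fallback;
  }
}
```

The caller would set suggestionsLoading before awaiting this and clear it afterward, which is the state the subscribe block above mirrors into chatStore.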