fix(relay): audit fixes — abort signal, model selector guard, SSE CRLF, SQL format
Some checks failed
CI / Lint & TypeCheck (push) Has been cancelled
CI / Unit Tests (push) Has been cancelled
CI / Build Frontend (push) Has been cancelled
CI / Rust Check (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
CI / E2E Tests (push) Has been cancelled
Addresses findings from deep code audit:
H-1: Pass abortController.signal to saasClient.chatCompletion() so
user-cancelled streams actually abort the HTTP connection (previously
cancellation only stopped the read loop, leaving the server-side SSE
connection open).
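A minimal TypeScript sketch of the H-1 pattern (the function body, URL, and
parameter shape here are hypothetical illustrations, not the actual relay
client): the client wrapper must forward the caller's AbortSignal into
fetch(), otherwise abort() only stops local consumption of the stream while
the HTTP connection stays open.

```typescript
// Hypothetical sketch: accept an optional AbortSignal and forward it to
// fetch(), so abortController.abort() tears down the request itself.
async function chatCompletion(
  body: unknown,
  signal?: AbortSignal,
): Promise<Response> {
  return fetch('https://relay.example.com/v1/chat/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
    signal, // without this, cancelling the reader leaves the SSE socket open
  });
}

// Caller side, mirroring the diff below:
//   const abortController = new AbortController();
//   const response = await chatCompletion(requestBody, abortController.signal);
//   ...later: abortController.abort(); // now aborts the HTTP request itself
```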
H-2: ModelSelector now renders only when (!isTauriRuntime() || isLoggedIn).
Prevents a purely decorative model list in Tauri local kernel mode,
where model selection has no effect (the old behavior violated
CLAUDE.md §5.2).
M-1: Normalize CRLF to LF before SSE event boundary parsing (\n\n).
Prevents unbounded buffer growth behind nginx/CDN proxies that emit
CRLF line endings, where the \n\n boundary never matches and events
are never consumed.
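The M-1 framing concern can be shown in isolation (function and variable
names here are illustrative, not the relay client's actual code): with CRLF
input the '\n\n' scan never matches, so events accumulate in the buffer
forever; normalizing first restores correct framing.

```typescript
// Illustrative SSE event extraction: normalize CRLF, then split the
// accumulated buffer on blank-line event boundaries, returning complete
// events plus the unconsumed remainder.
function extractSseEvents(buffer: string): { events: string[]; rest: string } {
  // Normalize CRLF to LF so event boundaries are always '\n\n'
  buffer = buffer.replace(/\r\n/g, '\n');
  const events: string[] = [];
  let boundary: number;
  while ((boundary = buffer.indexOf('\n\n')) !== -1) {
    events.push(buffer.slice(0, boundary));
    buffer = buffer.slice(boundary + 2);
  }
  return { events, rest: buffer };
}
```

Without the replace() step, the same CRLF input would yield zero events and
an ever-growing `rest`.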
M-2: SQL window_minute comparison uses to_char(NOW()-interval, format)
instead of (NOW()-interval)::TEXT, matching the stored format exactly.
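The M-2 mismatch can be illustrated outside SQL (the key format
'YYYY-MM-DD HH24:MI' and all names below are assumptions for illustration,
not the actual schema): a minute-bucket key stored as a formatted string
only matches comparison values produced with that exact format; a raw
timestamp stringification carries seconds and never matches.

```typescript
// Hypothetical minute-bucket key, assumed to match a to_char(...,
// 'YYYY-MM-DD HH24:MI')-style stored format.
function minuteKey(d: Date): string {
  const p = (n: number) => String(n).padStart(2, '0');
  return `${d.getUTCFullYear()}-${p(d.getUTCMonth() + 1)}-${p(d.getUTCDate())}` +
    ` ${p(d.getUTCHours())}:${p(d.getUTCMinutes())}`;
}

const now = new Date('2024-01-15T09:42:37Z');
minuteKey(now);    // "2024-01-15 09:42" — comparable to stored keys
now.toISOString(); // "2024-01-15T09:42:37.000Z" — wrong shape, never matches
```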
M-3: sort_candidates_by_quota uses same sliding 60s window as select_best_key.
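The sliding-window semantics referenced in M-3, sketched in TypeScript (the
real select_best_key / sort_candidates_by_quota code is not in this diff;
this is only the assumed counting rule): a sliding 60 s window counts events
with timestamps in (now − 60s, now], rather than events in the current
calendar minute.

```typescript
// Assumed sliding-window usage count: events strictly newer than
// (now - window) and no newer than now.
function usageInWindow(
  timestampsMs: number[],
  nowMs: number,
  windowMs = 60_000,
): number {
  return timestampsMs.filter(t => t > nowMs - windowMs && t <= nowMs).length;
}
```

Using the same rule in both the selection and the sorting path keeps their
quota views consistent.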
LOW: Fix misleading invalidate_cache doc comment.
@@ -31,6 +31,7 @@ import { ReasoningBlock } from './ai/ReasoningBlock';
 import { StreamingText } from './ai/StreamingText';
 import { ChatMode } from './ai/ChatMode';
 import { ModelSelector } from './ai/ModelSelector';
+import { isTauriRuntime } from '../lib/tauri-gateway';
 import { SuggestionChips } from './ai/SuggestionChips';
 import { PipelineResultPreview } from './pipeline/PipelineResultPreview';
 import { PresentationContainer } from './presentation/PresentationContainer';

@@ -562,7 +563,7 @@ export function ChatArea({ compact, onOpenDetail }: { compact?: boolean; onOpenD
         }
       </div>
       <div className="flex items-center gap-2">
-        {models.length > 0 && (
+        {models.length > 0 && (!isTauriRuntime() || isLoggedIn) && (
           <ModelSelector
             models={models.map(m => ({ id: m.id, name: m.name, provider: m.provider }))}
             currentModel={currentModel}

@@ -134,7 +134,7 @@ export function createSaaSRelayGatewayClient(
     if (opts?.plan_mode) body['plan_mode'] = true;
     if (opts?.subagent_enabled) body['subagent_enabled'] = true;

-    const response = await saasClient.chatCompletion(body);
+    const response = await saasClient.chatCompletion(body, abortController.signal);

    if (!response.ok) {
      const errText = await response.text().catch(() => '');

@@ -160,6 +160,9 @@ export function createSaaSRelayGatewayClient(

      buffer += decoder.decode(value, { stream: true });

+      // Normalize CRLF to LF for SSE spec compliance
+      buffer = buffer.replace(/\r\n/g, '\n');
+
      // Optimized SSE parsing: split by double-newline (event boundaries)
      let boundary: number;
      while ((boundary = buffer.indexOf('\n\n')) !== -1) {
||||