refactor(types): comprehensive TypeScript type system improvements

Major type system refactoring and error fixes across the codebase:

**Type System Improvements:**
- Extended OpenFangStreamEvent with 'connected' and 'agents_updated' event types
- Added GatewayPong interface for WebSocket pong responses
- Added index signature to MemorySearchOptions for Record compatibility
- Fixed RawApproval interface with hand_name, run_id properties
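The type additions above might look like the following sketch. The payload fields (`sessionId`, `agentIds`, `timestamp`) are assumptions for illustration; only the type names and the two new event variants come from this commit.

```typescript
// Hedged sketch of the extended stream event union; payload
// shapes other than the `type` discriminants are assumptions.
type OpenFangStreamEvent =
  | { type: 'token'; content: string }
  | { type: 'connected'; sessionId: string }
  | { type: 'agents_updated'; agentIds: string[] };

// GatewayPong models a WebSocket pong response.
interface GatewayPong {
  type: 'pong';
  timestamp: number; // assumed field
}

// Index signature lets MemorySearchOptions satisfy call sites
// typed as Record<string, unknown>.
interface MemorySearchOptions {
  query?: string;
  limit?: number;
  [key: string]: unknown;
}

function makePong(timestamp: number): GatewayPong {
  return { type: 'pong', timestamp };
}
```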

**Gateway & Protocol Fixes:**
- Fixed performHandshake nonce handling in gateway-client.ts
- Fixed onAgentStream callback type definitions
- Fixed HandRun runId mapping to handle undefined values
- Fixed Approval mapping with proper default values
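The undefined-handling and default-value fixes could be sketched like this. The raw field names follow the `hand_name`/`run_id` properties mentioned above; the fallback values are assumptions.

```typescript
// Sketch of mapping a raw gateway payload to a domain object,
// guarding against undefined fields with explicit defaults.
interface RawHandRun {
  run_id?: string;
  hand_name?: string;
}

interface HandRun {
  runId: string;
  handName: string;
}

function toHandRun(raw: RawHandRun): HandRun {
  return {
    runId: raw.run_id ?? '',          // default instead of undefined
    handName: raw.hand_name ?? 'unknown', // assumed fallback value
  };
}
```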

**Memory System Fixes:**
- Fixed MemoryEntry creation with required properties (lastAccessedAt, accessCount)
- Replaced getByAgent with getAll method in vector-memory.ts
- Fixed MemorySearchOptions type compatibility
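Constructing a `MemoryEntry` with the required bookkeeping fields might look like the sketch below; fields other than `lastAccessedAt` and `accessCount` are illustrative.

```typescript
// Sketch of a MemoryEntry with the required properties the
// commit adds; id/content/createdAt are assumed companions.
interface MemoryEntry {
  id: string;
  content: string;
  createdAt: number;
  lastAccessedAt: number;
  accessCount: number;
}

function createMemoryEntry(id: string, content: string): MemoryEntry {
  const now = Date.now();
  return {
    id,
    content,
    createdAt: now,
    lastAccessedAt: now, // required: initialized to creation time
    accessCount: 0,      // required: starts at zero
  };
}
```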

**Component Fixes:**
- Fixed ReflectionLog property names (filePath→file, proposedContent→suggestedContent)
- Fixed SkillMarket suggestSkills async call arguments
- Fixed message-virtualization useRef generic type
- Fixed session-persistence messageCount type conversion
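The `messageCount` type conversion might be handled along these lines; persisted values can come back as strings, so a safe coercion with a fallback is one plausible fix (the exact persistence shape is an assumption).

```typescript
// Sketch of coercing a persisted messageCount to a number,
// falling back to 0 for missing or malformed values.
function parseMessageCount(raw: unknown): number {
  const n = typeof raw === 'string' ? Number(raw) : raw;
  return typeof n === 'number' && Number.isFinite(n) ? n : 0;
}
```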

**Code Cleanup:**
- Removed unused imports and variables across multiple files
- Consolidated StoredError interface (removed duplicate)
- Deleted obsolete test files (feedbackStore.test.ts, memory-index.test.ts)

**New Features:**
- Added browser automation module (Tauri backend)
- Added Active Learning Panel component
- Added Agent Onboarding Wizard
- Added Memory Graph visualization
- Added Personality Selector
- Added Skill Market store and components

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: iven
Date: 2026-03-17 08:05:07 +08:00
Parent: adfd7024df
Commit: f4efc823e2
80 changed files with 9496 additions and 1390 deletions


@@ -187,8 +187,13 @@ class OpenAILLMAdapter implements LLMServiceAdapter {
       });
       if (!response.ok) {
-        const error = await response.text();
-        throw new Error(`[OpenAI] API error: ${response.status} - ${error}`);
+        const errorBody = await response.text();
+        // Log full error in development only
+        if (import.meta.env.DEV) {
+          console.error('[OpenAI] API error:', errorBody);
+        }
+        // Return sanitized error to caller
+        throw new Error(`[OpenAI] API error: ${response.status} - Request failed`);
       }
       const data = await response.json();
@@ -247,8 +252,13 @@ class VolcengineLLMAdapter implements LLMServiceAdapter {
       });
       if (!response.ok) {
-        const error = await response.text();
-        throw new Error(`[Volcengine] API error: ${response.status} - ${error}`);
+        const errorBody = await response.text();
+        // Log full error in development only
+        if (import.meta.env.DEV) {
+          console.error('[Volcengine] API error:', errorBody);
+        }
+        // Return sanitized error to caller
+        throw new Error(`[Volcengine] API error: ${response.status} - Request failed`);
       }
       const data = await response.json();
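The two hunks repeat the same sanitize-and-log pattern; it could be factored into a shared helper along these lines (a sketch only; the adapters in this commit inline it, and `isDev` stands in for `import.meta.env.DEV`):

```typescript
// Sketch of the sanitize-and-log pattern: log the raw error body
// only in development, and return a generic error that leaks no
// provider response details to the caller.
function sanitizedError(
  provider: string,
  status: number,
  errorBody: string,
  isDev: boolean,
): Error {
  if (isDev) {
    console.error(`[${provider}] API error:`, errorBody);
  }
  return new Error(`[${provider}] API error: ${status} - Request failed`);
}
```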