refactor(types): comprehensive TypeScript type system improvements
Major type system refactoring and error fixes across the codebase:

**Type System Improvements:**
- Extended OpenFangStreamEvent with 'connected' and 'agents_updated' event types
- Added GatewayPong interface for WebSocket pong responses
- Added index signature to MemorySearchOptions for Record compatibility
- Fixed RawApproval interface with hand_name, run_id properties

**Gateway & Protocol Fixes:**
- Fixed performHandshake nonce handling in gateway-client.ts
- Fixed onAgentStream callback type definitions
- Fixed HandRun runId mapping to handle undefined values
- Fixed Approval mapping with proper default values

**Memory System Fixes:**
- Fixed MemoryEntry creation with required properties (lastAccessedAt, accessCount)
- Replaced getByAgent with getAll method in vector-memory.ts
- Fixed MemorySearchOptions type compatibility

**Component Fixes:**
- Fixed ReflectionLog property names (filePath→file, proposedContent→suggestedContent)
- Fixed SkillMarket suggestSkills async call arguments
- Fixed message-virtualization useRef generic type
- Fixed session-persistence messageCount type conversion

**Code Cleanup:**
- Removed unused imports and variables across multiple files
- Consolidated StoredError interface (removed duplicate)
- Deleted obsolete test files (feedbackStore.test.ts, memory-index.test.ts)

**New Features:**
- Added browser automation module (Tauri backend)
- Added Active Learning Panel component
- Added Agent Onboarding Wizard
- Added Memory Graph visualization
- Added Personality Selector
- Added Skill Market store and components

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
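The "index signature to MemorySearchOptions for Record compatibility" fix can be sketched as follows. The actual fields of MemorySearchOptions are not shown in this diff, so `query` and `limit` here are hypothetical placeholders; the point is the `[key: string]: unknown` signature, which is what makes the interface assignable where a `Record<string, unknown>` is expected.

```typescript
// Hypothetical sketch: real field names of MemorySearchOptions are not in
// this commit's diff, so `query` and `limit` are illustrative only.
interface MemorySearchOptions {
  query?: string;
  limit?: number;
  // Index signature: lets the interface satisfy Record<string, unknown>
  [key: string]: unknown;
}

function toRecord(opts: MemorySearchOptions): Record<string, unknown> {
  // Without the index signature above, this assignment is a type error
  return opts;
}
```

Without the index signature, TypeScript rejects the assignment because an interface with only known properties is not implicitly compatible with an open-ended `Record`.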
```diff
@@ -187,8 +187,13 @@ class OpenAILLMAdapter implements LLMServiceAdapter {
     });
 
     if (!response.ok) {
-      const error = await response.text();
-      throw new Error(`[OpenAI] API error: ${response.status} - ${error}`);
+      const errorBody = await response.text();
+      // Log full error in development only
+      if (import.meta.env.DEV) {
+        console.error('[OpenAI] API error:', errorBody);
+      }
+      // Return sanitized error to caller
+      throw new Error(`[OpenAI] API error: ${response.status} - Request failed`);
     }
 
     const data = await response.json();
@@ -247,8 +252,13 @@ class VolcengineLLMAdapter implements LLMServiceAdapter {
     });
 
     if (!response.ok) {
-      const error = await response.text();
-      throw new Error(`[Volcengine] API error: ${response.status} - ${error}`);
+      const errorBody = await response.text();
+      // Log full error in development only
+      if (import.meta.env.DEV) {
+        console.error('[Volcengine] API error:', errorBody);
+      }
+      // Return sanitized error to caller
+      throw new Error(`[Volcengine] API error: ${response.status} - Request failed`);
     }
 
     const data = await response.json();
```
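The same sanitized-error pattern is applied to both adapters, so it can be factored out. The sketch below is not code from this commit: `isDev` stands in for Vite's `import.meta.env.DEV` flag, and `FakeResponse` is a minimal stand-in for the Fetch API's `Response` so the example is self-contained.

```typescript
// Minimal stand-in for the parts of fetch's Response the pattern uses.
interface FakeResponse {
  ok: boolean;
  status: number;
  text(): Promise<string>;
}

// The caller only ever sees the provider name and status code,
// never the raw response body.
function sanitizedApiError(provider: string, status: number): Error {
  return new Error(`[${provider}] API error: ${status} - Request failed`);
}

async function checkResponse(
  provider: string,
  response: FakeResponse,
  isDev: boolean, // stands in for import.meta.env.DEV
): Promise<void> {
  if (!response.ok) {
    const errorBody = await response.text();
    if (isDev) {
      // Full error body is logged in development only
      console.error(`[${provider}] API error:`, errorBody);
    }
    throw sanitizedApiError(provider, response.status);
  }
}
```

The design choice here is to keep potentially sensitive upstream error bodies (which may echo request details) out of errors that propagate to the UI or to logs in production, while preserving full detail for local debugging.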