fix: migrate glm-4-flash to glm-4-flash-250414 (model deprecated by Zhipu)
Some checks failed
CI / Lint & TypeCheck (push) Has been cancelled
CI / Unit Tests (push) Has been cancelled
CI / Build Frontend (push) Has been cancelled
CI / Rust Check (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
CI / E2E Tests (push) Has been cancelled
Zhipu AI has deprecated glm-4-flash, causing 404 errors on all chat requests. Updated all references:
- config: glm-4-flash → glm-4-flash-250414, added glm-z1-flash
- frontend: defaultModel, conversationStore, ChatArea fallback, ModelsAPI
@@ -192,7 +192,7 @@ export const useConversationStore = create<ConversationState>()(
       agents: [DEFAULT_AGENT],
       currentAgent: DEFAULT_AGENT,
       sessionKey: null,
-      currentModel: 'glm-4-flash',
+      currentModel: 'glm-4-flash-250414',

       newConversation: (currentMessages: ChatMessage[]) => {
         const state = get();
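Changing the store's default only affects new state; conversations already persisted with the deprecated model ID would still send `glm-4-flash` and hit the 404. A minimal sketch of one way to handle that, assuming a hypothetical `resolveModel` helper that is not part of this commit:

```typescript
// Hypothetical helper (not in this commit): map deprecated Zhipu model IDs
// to their replacements so state persisted with an old ID keeps working.
const DEPRECATED_MODELS: Record<string, string> = {
  'glm-4-flash': 'glm-4-flash-250414',
};

export function resolveModel(modelId: string): string {
  // Known deprecated IDs are rewritten; anything else passes through.
  return DEPRECATED_MODELS[modelId] ?? modelId;
}

// Example: normalize the stored model before each chat request.
console.log(resolveModel('glm-4-flash'));    // rewritten to the new ID
console.log(resolveModel('glm-z1-flash'));   // unchanged
```

Calling this at request time (e.g. in the ChatArea fallback path) would cover old persisted stores without requiring users to clear local storage.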