chat-ui
62ae8a90 - fix: set max_tokens for Qwen3.5-397B-A17B to prevent truncation

Together AI defaults to 2048 max_tokens when unset, causing responses to be cut off mid-sentence. Set max_tokens to 32768, per Qwen's official recommendation for general use.
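For context, a minimal sketch of the kind of request this change affects: a chat completion call to Together AI's OpenAI-compatible endpoint with max_tokens set explicitly instead of left to the 2048-token default. The fetch wrapper, message payload, and exact model identifier here are illustrative assumptions, not chat-ui's actual code; only the model name from the subject line, the 2048 default, and the 32768 value come from this commit.

```ts
// Sketch: explicit max_tokens on a Together AI chat completion request.
// Without it, Together AI caps output at 2048 tokens and long responses
// are truncated mid-sentence.
const response = await fetch("https://api.together.xyz/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.TOGETHER_API_KEY}`,
  },
  body: JSON.stringify({
    // Model name taken from the commit subject; Together AI model IDs
    // typically carry an org prefix, so the exact string may differ.
    model: "Qwen3.5-397B-A17B",
    messages: [{ role: "user", content: "Summarize this document." }],
    // 32768 per Qwen's recommendation for general use; omitting this
    // field falls back to Together AI's 2048 default.
    max_tokens: 32768,
  }),
});

const data = await response.json();
console.log(data.choices[0].message.content);
```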