chat-ui · commit 999f6162
Committed 20 days ago
Add max_tokens parameter to GLM-4.6V model config

Set the max_tokens parameter to 8192 for the zai-org/GLM-4.6V model in both the dev and prod environment YAML files to control output length.
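A minimal sketch of what the resulting model entry might look like; the file layout, the `models` key, and the nesting are assumptions inferred from the commit message rather than the actual diff — only the model id and the max_tokens value are stated in the commit.

```yaml
# Hypothetical layout of the dev/prod environment YAML files --
# only zai-org/GLM-4.6V and max_tokens: 8192 come from the commit message.
models:
  - name: zai-org/GLM-4.6V
    max_tokens: 8192  # caps how many tokens the model may generate per response
```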
Author: gary149
Parents: 36d72f12