vllm #9837 (Merged)
[Misc][OpenAI] deprecate max_tokens in favor of new max_completion_tokens field for chat completion endpoint
Commits (7)
- feat(openai): add new field max_completion_tokens (gcalmettes, 1 year ago)
- feat: use new max_completion_tokens field for chat_completions calls (gcalmettes, 1 year ago)
- lint: apply formatting (gcalmettes, 1 year ago)
- feat: bump openai dependency to version introducing max_completion_tokens (gcalmettes, 1 year ago)
- lint: apply yapf (gcalmettes, 1 year ago)
- chore: reflect max_completion_tokens support for minimal openai lib version (gcalmettes, 1 year ago)
- feat: tie TODO comments to relevant issue for better referencing (gcalmettes, 1 year ago)