vllm-project/vllm
[Bugfix] Set enforce_eager automatically for mllama #12127
Merged

mgoin merged 2 commits into vllm-project:main from heheda12345:mllama-eager
Commits:
- heheda12345: mllama fallback to eager (f4070a45)
- heheda12345: format (19973ac1)
mgoin approved these changes on 2025-01-16
mgoin added the ready label
mgoin merged commit d06e8240 into main
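
Per the PR title and the "mllama fallback to eager" commit, the change makes vLLM force eager execution automatically when an mllama model is loaded, instead of requiring the user to pass enforce_eager themselves. The sketch below is a minimal illustration of that idea only, not the PR's actual diff: the resolve_enforce_eager helper is hypothetical, and the MllamaForConditionalGeneration architecture name is assumed here as the mllama identifier.

```python
# Minimal sketch (hypothetical, not vLLM's real implementation) of automatically
# forcing eager mode for mllama models.

MLLAMA_ARCHITECTURES = {"MllamaForConditionalGeneration"}  # assumed identifier


def resolve_enforce_eager(architectures: list[str], enforce_eager: bool | None) -> bool:
    """Return the effective enforce_eager setting for a given model."""
    if any(arch in MLLAMA_ARCHITECTURES for arch in architectures):
        # mllama falls back to eager execution automatically, so users do not
        # have to set enforce_eager explicitly for this model family.
        return True
    # Otherwise honor the user's explicit choice, defaulting to False.
    return bool(enforce_eager)


# Loading an mllama checkpoint without passing enforce_eager still yields eager mode.
assert resolve_enforce_eager(["MllamaForConditionalGeneration"], None) is True
assert resolve_enforce_eager(["LlamaForCausalLM"], None) is False
```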
