vllm
[feat] Enable mm caching for transformers backend #21358
Merged

zucchini-nlp committed "dont ask to explicitly disable caching" (ea65be5e)
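The commit message above suggests that, before this change, the Transformers backend asked users to explicitly disable the multimodal preprocessor cache. A minimal usage sketch of what the merged PR enables; the model name is illustrative and not taken from this PR:

```python
# Minimal sketch, assuming vLLM's `model_impl` engine argument;
# the model name is an example, not taken from this PR.
from vllm import LLM

llm = LLM(
    model="llava-hf/llava-1.5-7b-hf",  # example multimodal model
    model_impl="transformers",         # force the Transformers fallback backend
    # Before this PR, the Transformers backend asked users to explicitly
    # disable multimodal caching; with it merged, the cache is usable
    # and no extra flag is needed.
)
```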
zucchini-nlp requested reviews from hmellor, WoosukKwon, robertgshaw2-redhat, njhill, ywang96, comaniac, and alexm-redhat 144 days ago
mergify added the documentation and v1 labels
gemini-code-assist commented on 2025-07-22
DarkLight1337 requested a review from Isotr0py 144 days ago
zucchini-nlp committed "return hashes" (a4290b07)
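The "return hashes" commit suggests the processor returns content hashes that key the multimodal cache. A conceptual sketch of hash-keyed caching; the helper names are hypothetical and this is not vLLM's actual implementation:

```python
# Conceptual sketch of hash-keyed multimodal caching; helper names are
# hypothetical and this is not vLLM's actual code.
import hashlib
from typing import Any, Callable

_mm_cache: dict[str, Any] = {}

def mm_hash(data: bytes) -> str:
    """Content hash used as the cache key for one multimodal input."""
    return hashlib.sha256(data).hexdigest()

def process_mm_input(data: bytes, preprocess: Callable[[bytes], Any]) -> Any:
    """Return the cached preprocessor output, computing it on a miss."""
    key = mm_hash(data)
    if key not in _mm_cache:
        _mm_cache[key] = preprocess(data)
    return _mm_cache[key]
```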
zucchini-nlp requested a review from DarkLight1337 144 days ago
mergify added the multi-modality label
Isotr0py approved these changes on 2025-07-22
Isotr0py enabled auto-merge (squash) 144 days ago
DarkLight1337 added this to the v0.10.0 milestone 144 days ago
DarkLight1337 added the ready label
zucchini-nlp changed the title from "[Bugfix] mm caching isn't tied to prefix caching" to "[feat] Enable mm caching for transformers backend" 144 days ago
vllm-bot merged f38ee34a into main 144 days ago