`Llama` family: fix `use_cache=False` generation #30380
Commits:
- nit to make sure cache positions are not sliced (6748f646)
- fix other models (2d553394)
- nit (ea0e2205)
- style (deab6cff)
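The fix described by the commit messages concerns how per-step inputs are prepared during generation: when a KV cache is in use, only the newest token (and its cache position) is fed to the model, but when `use_cache=False` the whole sequence is re-run each step, so the cache positions must not be sliced. A minimal illustrative sketch of that rule (the function name `prepare_inputs` is hypothetical, not the actual `transformers` code):

```python
def prepare_inputs(input_ids, cache_position, use_cache):
    # Hypothetical sketch of the input-preparation rule this PR fixes.
    # With a KV cache, past tokens are already cached, so only the last
    # token and its matching cache position are passed to the model.
    if use_cache:
        return input_ids[-1:], cache_position[-1:]
    # Without a cache, the full sequence is recomputed every step;
    # slicing cache_position here would misalign token positions.
    return input_ids, cache_position
```

For example, with `input_ids = [0, 1, 2]` and `cache_position = [0, 1, 2]`, the cached path feeds only `([2], [2])` to the model, while the cache-free path must keep the full `([0, 1, 2], [0, 1, 2])`.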
gante approved these changes on 2024-04-22.
ArthurZucker deleted the fix-llama-no-cache branch 1 year ago.