transformers
`Llama` family, fix `use_cache=False` generation
#30380
Merged


ArthurZucker merged 4 commits into main from fix-llama-no-cache
Commits:
6748f646 nit to make sure cache positions are not sliced
2d553394 fix other models
ea0e2205 nit
deab6cff style
ArthurZucker requested a review from gante 1 year ago
gante approved these changes on 2024-04-22
ArthurZucker merged 2d92db84 into main 1 year ago
ArthurZucker deleted the fix-llama-no-cache branch 1 year ago
