transformers
2d92db84 - `Llama` family, fix `use_cache=False` generation (#30380)

Commit message:

* nit to make sure cache positions are not sliced
* fix other models
* nit
* style
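For context, a minimal sketch of the generation path this commit fixes: with `use_cache=False` there is no KV cache to append to, so each decoding step re-runs the model over the full sequence, and the internal cache positions must cover the whole input rather than being sliced to only the newest tokens. The checkpoint name below is illustrative, not taken from the commit; any Llama-family causal LM should exercise the same path.

```python
# Minimal sketch, assuming a Llama-family checkpoint is available on the Hub.
# The model id is an assumption for illustration only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-hf"  # assumption: any Llama-family causal LM
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The capital of France is", return_tensors="pt")

# Disabling the KV cache forces a full forward pass per step; before this
# fix, sliced cache positions broke this path for Llama-family models.
output = model.generate(**inputs, max_new_tokens=8, use_cache=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```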