transformers
2d92db84
- `Llama` family, fix `use_cache=False` generation (#30380)
Commit
1 year ago
`Llama` family, fix `use_cache=False` generation (#30380)

* nit to make sure cache positions are not sliced
* fix other models
* nit
* style
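For context, the affected path is generation with the KV cache disabled. A minimal sketch of exercising that path, assuming a standard transformers causal-LM setup (the checkpoint name and prompt are illustrative):

```python
# Sketch of the code path this commit fixes: generation with the KV
# cache disabled via use_cache=False. Checkpoint and prompt are
# illustrative, not taken from the commit itself.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-hf"  # any Llama-family checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Hello, my name is", return_tensors="pt")

# With use_cache=False the model recomputes attention over the full
# sequence at every step, so the cache position indices must span the
# whole sequence instead of being sliced down to the newest token.
output_ids = model.generate(**inputs, max_new_tokens=20, use_cache=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```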
References
#30380 - `Llama` family, fix `use_cache=False` generation
Author
ArthurZucker
Parents
f16caf44