transformers
a48d68c6
- Fix some models cache initialization (#42586)
Commit
131 days ago
Fix some models cache initialization (#42586)

* Create cache when training in case generate needs to be called
* Align modular
* fixes
* cohere
* fix modular
* fix
* review

Co-authored-by: Cyril Vallez <cyril.vallez@gmail.com>
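The fix above boils down to creating the generation cache lazily, so that calling generate on a model configured for training does not fail on a missing cache. A minimal sketch of that pattern follows; the class and method names (`SimpleCache`, `TinyModel`) are illustrative stand-ins, not the actual transformers API.

```python
# Sketch of the pattern behind this fix: initialize a key/value cache
# lazily so generate() works even when the model was set up for training.
# All names here are hypothetical, for illustration only.

class SimpleCache:
    """Toy per-layer cache: one list of cached values per layer."""
    def __init__(self, num_layers: int):
        self.layers = [[] for _ in range(num_layers)]

    def update(self, layer_idx: int, value):
        self.layers[layer_idx].append(value)
        return self.layers[layer_idx]


class TinyModel:
    def __init__(self, num_layers: int = 2):
        self.num_layers = num_layers
        self.cache = None  # not built at init: training may never need it

    def generate(self, tokens):
        # Create the cache on first use instead of assuming the training
        # setup already built one -- the essence of the fix.
        if self.cache is None:
            self.cache = SimpleCache(self.num_layers)
        out = []
        for t in tokens:
            for layer in range(self.num_layers):
                self.cache.update(layer, t)
            out.append(t)
        return out


model = TinyModel()
result = model.generate([1, 2, 3])  # works without any prior cache setup
```

The design choice is to guard cache creation inside generate itself rather than in the training path, so every caller gets a valid cache regardless of how the model was constructed.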
References
#42586 - Fix some models cache initialization
Author
albertvillanova
Parents
9e82c779