transformers
33c60a52 - [`T5Gemma`] Fix cross attention cache (#41890)
Commit
34 days ago
[`T5Gemma`] Fix cross attention cache (#41890)

* fix
* add test
* style
* added comment
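The commit message does not include the diff, so the sketch below is not the actual patch; it only illustrates the mechanism the title refers to. In an encoder-decoder model such as T5Gemma, the cross-attention keys and values are projections of the encoder output, which stays fixed throughout decoding, so they can be computed once on the first decoding step and reused on every later step. A minimal sketch of that caching pattern, assuming a generic multi-head cross-attention layer; all class and variable names here are hypothetical and not taken from the T5Gemma source:

```python
import torch
from torch import nn

class CrossAttention(nn.Module):
    """Cross-attention with a simple key/value cache (illustrative only)."""

    def __init__(self, dim: int, num_heads: int):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.o_proj = nn.Linear(dim, dim)

    def _shape(self, x: torch.Tensor) -> torch.Tensor:
        # (batch, seq, dim) -> (batch, heads, seq, head_dim)
        b, s, _ = x.shape
        return x.view(b, s, self.num_heads, self.head_dim).transpose(1, 2)

    def forward(self, hidden, encoder_out, cache=None):
        q = self._shape(self.q_proj(hidden))
        if cache is None:
            # First decoding step: project the (fixed) encoder states
            # and cache the result for all subsequent steps.
            cache = (self._shape(self.k_proj(encoder_out)),
                     self._shape(self.v_proj(encoder_out)))
        k, v = cache  # later steps reuse the cached projections
        attn = torch.softmax(
            q @ k.transpose(-1, -2) / self.head_dim ** 0.5, dim=-1
        )
        out = (attn @ v).transpose(1, 2).flatten(2)
        return self.o_proj(out), cache

torch.manual_seed(0)
attn = CrossAttention(dim=64, num_heads=4)
enc = torch.randn(1, 10, 64)         # fixed encoder output
tok = torch.randn(1, 1, 64)          # one decoder position per step
out1, cache = attn(tok, enc)         # step 1 builds the cache
out2, cache = attn(tok, enc, cache)  # step 2 reuses it, skipping k/v projection
```

A defect in a cache like this would typically surface either as wasted compute (re-projecting the encoder states on every step) or as the decoder attending over stale or misindexed key/value states, which is the general failure mode a fix of this kind guards against.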
References
#41890 - [`T5Gemma`] Fix cross attention cache
Author
vasqu
Parents
fa22b569