transformers
d03a3ca6
- [`OPT`] Fix attention scaling (#38290)
Commit
202 days ago
[`OPT`] Fix attention scaling (#38290)

* fix opt attention scaling
* add comment to why we do this
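The commit message does not include the diff, but "attention scaling" in transformer models conventionally refers to the 1/sqrt(head_dim) factor applied to attention scores. A minimal sketch of that convention, assuming the standard scaled dot-product formulation (the function and shapes below are illustrative, not the OPT implementation itself):

```python
import math
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Illustrative scaled dot-product attention.

    q, k, v: arrays of shape (..., seq_len, head_dim).
    """
    head_dim = q.shape[-1]
    # The scaling this commit concerns: divide scores by sqrt(head_dim)
    # so softmax logits keep roughly unit variance as head size grows.
    scores = (q @ k.swapaxes(-1, -2)) / math.sqrt(head_dim)
    # Subtract the row max before exponentiating for numerical stability.
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

Without the sqrt(head_dim) divisor, score magnitudes grow with head dimension, saturating the softmax and degrading gradients; applying (or misapplying) this factor is a common source of subtle attention bugs.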
References
#38290 - [`OPT`] Fix attention scaling
Author
vasqu
Parents
a5a0c7b8