optimum
42924f8e
Fix arg in bettertransformer llama attention (#1421)
Fix arg in bettertransformer llama attention (#1421)

* fix arg in llama attention
* change to kwargs
* add kwargs everywhere

Co-authored-by: younesbelkada <younesbelkada@gmail.com>
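The diff itself is not shown on this page; below is a minimal, hypothetical sketch of the pattern the commit message describes: giving the patched attention forward a **kwargs catch-all so that extra keyword arguments passed by newer transformers releases are absorbed instead of raising a TypeError. The names used here (PatchedAttention, padding_mask) are illustrative and are not optimum's actual code.

```python
import torch
from torch import nn


class PatchedAttention(nn.Module):
    """Toy stand-in for a BetterTransformer-patched attention module."""

    def __init__(self, hidden_size: int = 8):
        super().__init__()
        self.proj = nn.Linear(hidden_size, hidden_size)

    def forward(self, hidden_states, attention_mask=None, **kwargs):
        # **kwargs absorbs any keyword the caller adds in a newer release
        # (the commit message's "add kwargs everywhere"), so the patched
        # signature no longer has to mirror the upstream one exactly.
        return self.proj(hidden_states), None


attn = PatchedAttention()
x = torch.randn(1, 4, 8)
# An extra keyword such as this hypothetical `padding_mask` is ignored
# instead of raising a TypeError.
out, _ = attn(x, padding_mask=None)
print(out.shape)  # torch.Size([1, 4, 8])
```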
Author: SunMarc
Committer: fxmarty
Parents: 69d34a9c