transformers
d1eda63f
- [`Padding-Free Attention`] Fix packed FA attention with pos ids only (#42801)
Commit · 5 days ago
[`Padding-Free Attention`] Fix packed FA attention with pos ids only (#42801)

* fix position ids
* style
* fix
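For context, padding-free (packed) attention concatenates several sequences into a single batch row and relies on `position_ids` resetting to 0 at each sequence boundary; varlen flash-attention kernels then need the cumulative sequence lengths (`cu_seqlens`) recovered from those resets. A minimal sketch of that derivation, assuming a single packed row — the helper name is hypothetical and this is not transformers' actual implementation:

```python
def cu_seqlens_from_position_ids(position_ids):
    """Derive cumulative sequence lengths for one packed row.

    A new sequence starts wherever the position id resets to 0,
    so the start offsets plus the total length give cu_seqlens.
    (Hypothetical helper for illustration only.)
    """
    starts = [i for i, p in enumerate(position_ids) if p == 0]
    return starts + [len(position_ids)]


# Three packed sequences of lengths 3, 2, and 4:
packed = [0, 1, 2, 0, 1, 0, 1, 2, 3]
print(cu_seqlens_from_position_ids(packed))  # → [0, 3, 5, 9]
```

Each adjacent pair in the result brackets one sequence, which is what lets the kernel attend within sequences without cross-contamination despite the absence of padding or an attention mask.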
References
#42801 - [`Padding-Free Attention`] Fix packed FA attention with pos ids only
Author
vasqu
Parents
f5aa90d0