transformers
1ed19360
- [`FlexAttention`] Reenable flex for encoder-decoder and make the test more robust (#38321)
Commit
254 days ago
* reenable most flex attention test cases
* style
* trigger
* trigger
References
#38321 - [`FlexAttention`] Reenable flex for encoder-decoder and make the test more robust
Author
vasqu
Parents
bb567d85