[Pallas] Support Flash Attention #6658
alanwaketan changed the title from [WIP] Support Pallas Flash Attention to [Pallas] Support Flash Attention 1 year ago
alanwaketan marked this pull request as ready for review 1 year ago
Commits:
tmp (406224aa)
Revert some changes (6e8f05d0)
Revert more changes (ebcfca96)
Turn off TPU layout by default (d24a2220)
Improve the test case (cb0a1bb5)
Turn tpu layout back on (f424e2e9)
Reenable torchvision (1031d1d6)
Fix linters (c8051095)
Fix more linters (e60f07b5)
alanwaketan force-pushed from 37cf282f to e60f07b5 1 year ago
qihqi approved these changes on 2024-03-06
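
For context, a minimal sketch of invoking the flash attention kernel this PR adds. It assumes the `torch_xla.experimental.custom_kernel.flash_attention` entry point and a TPU device; the tensor shapes and the `print` check are illustrative, not taken from the PR itself.

```python
# Minimal usage sketch (assumes a TPU backend and the experimental custom_kernel API).
# Shapes are illustrative: (batch, num_heads, seq_len, head_dim).
import torch
import torch_xla.core.xla_model as xm
from torch_xla.experimental.custom_kernel import flash_attention

device = xm.xla_device()
q = torch.randn(1, 2, 128, 64, device=device)
k = torch.randn(1, 2, 128, 64, device=device)
v = torch.randn(1, 2, 128, 64, device=device)

# The Pallas kernel computes attention in fused blocks instead of
# materializing the full (seq_len x seq_len) score matrix.
out = flash_attention(q, k, v)
print(out.shape)  # expected: torch.Size([1, 2, 128, 64])
```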