xla
[Pallas] Support Flash Attention #6658
Merged

alanwaketan merged 9 commits into master from alanwaketan/flash_attention
alanwaketan changed the title from [WIP] Support Pallas Flash Attention to [Pallas] Support Flash Attention (1 year ago)
alanwaketan self-assigned this pull request (1 year ago)
alanwaketan marked this pull request as ready for review (1 year ago)
alanwaketan requested a review from JackCaoG (1 year ago)
alanwaketan requested a review from qihqi (1 year ago)
Commits:
- tmp (406224aa)
- Revert some changes (6e8f05d0)
- Revert more changes (ebcfca96)
- Turn off TPU layout by default (d24a2220)
- Improve the test case (cb0a1bb5)
- Turn tpu layout back on (f424e2e9)
- Reenable torchvision (1031d1d6)
- Fix linters (c8051095)
- Fix more linters (e60f07b5)
alanwaketan force-pushed from 37cf282f to e60f07b5 (1 year ago)
qihqi approved these changes on 2024-03-06
alanwaketan merged 9d4dcae7 into master (1 year ago)
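
For context, a change like this is normally exercised through a thin Python wrapper around the Pallas kernel. Below is a minimal usage sketch, assuming the wrapper is exposed as flash_attention in torch_xla.experimental.custom_kernel and that inputs are (batch, num_heads, seq_len, head_dim) tensors on a TPU device; the module path, function name, and shapes are assumptions inferred from the PR title, not taken from this page.

    import torch
    import torch_xla.core.xla_model as xm
    # Assumed location of the Pallas flash attention wrapper (not confirmed by this page).
    from torch_xla.experimental.custom_kernel import flash_attention

    device = xm.xla_device()
    # Query/key/value laid out as (batch, num_heads, seq_len, head_dim).
    q = torch.randn(1, 2, 128, 4, device=device)
    k = torch.randn(1, 2, 128, 4, device=device)
    v = torch.randn(1, 2, 128, 4, device=device)

    # The wrapper is expected to dispatch to the Pallas flash attention kernel on TPU.
    o = flash_attention(q, k, v)
    xm.mark_step()
    print(o.shape)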