[Pallas] Make FlashAttention a torch.autograd.Function #6886
Merged

alanwaketan merged 9 commits into master from alanwaketan/fa_autograd
alanwaketan requested a review from lsy323 1 year ago
alanwaketan requested a review from JackCaoG 1 year ago
alanwaketan assigned alanwaketan 1 year ago
alanwaketan Initial commit (d18e1082)
alanwaketan Fix the test (39e1dead)
alanwaketan Add a testcase somewhat (675c8b34)
alanwaketan Fix linters (02a15f0b)
alanwaketan force-pushed from 5b316baa to 02a15f0b 1 year ago
alanwaketan Fix the test (f58c8e78)
JackCaoG commented on 2024-04-04
JackCaoG commented on 2024-04-04
alanwaketan commented on 2024-04-04
JackCaoG commented on 2024-04-04
alanwaketan Improve lmi (3cc5271b)
alanwaketan Fix linters (36027f0f)
alanwaketan Address comments (2c71054b)
alanwaketan Minor fix (dcb859c5)
JackCaoG approved these changes on 2024-04-04
alanwaketan merged 0c704cf8 into master 1 year ago
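
For context on what the PR title describes: wrapping a forward/backward kernel pair in a `torch.autograd.Function` lets autograd route gradients through a custom backward kernel instead of differentiating the forward trace. Below is a minimal, runnable sketch of that pattern, not the PR's actual code; `_attention_kernel` and the recompute-based `backward` are plain-PyTorch stand-ins for the Pallas FlashAttention kernels.

```python
import math
import torch


def _attention_kernel(q, k, v):
    # Plain-op stand-in for the Pallas FlashAttention forward kernel.
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.shape[-1])
    return torch.softmax(scores, dim=-1) @ v


class FlashAttention(torch.autograd.Function):
    @staticmethod
    def forward(ctx, q, k, v):
        # A real FlashAttention kernel would also stash softmax statistics
        # (row max / logsumexp) here for the backward pass.
        ctx.save_for_backward(q, k, v)
        return _attention_kernel(q, k, v)

    @staticmethod
    def backward(ctx, grad_out):
        q, k, v = ctx.saved_tensors
        # Stand-in for the Pallas backward kernel: recompute the forward
        # under autograd and differentiate it with respect to the inputs.
        with torch.enable_grad():
            q_, k_, v_ = (t.detach().requires_grad_() for t in (q, k, v))
            out = _attention_kernel(q_, k_, v_)
            return torch.autograd.grad(out, (q_, k_, v_), grad_out)


q, k, v = (torch.randn(2, 8, 16, requires_grad=True) for _ in range(3))
out = FlashAttention.apply(q, k, v)
out.sum().backward()  # gradients flow through the custom backward
```

Callers invoke `FlashAttention.apply(q, k, v)` rather than the class directly; `apply` is what registers the custom backward with the autograd graph.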
