[Pallas] Support Flash Attention backward kernels #6870
Merged

alanwaketan merged 4 commits into master from alanwaketan/fa_backward

Commits (4):
  1ce54b1a  support test__flash_attention_impl
  cf9a1d07  Support test__flash_attention_bwd_dkv
  e01dead5  Support test__flash_attention_bwd_dkv
  99dc7c0a  Fix linters

alanwaketan requested a review from lsy323 (1 year ago)
alanwaketan requested a review from JackCaoG (1 year ago)
alanwaketan assigned alanwaketan (1 year ago)
JackCaoG commented on 2024-04-02
JackCaoG approved these changes on 2024-04-02
alanwaketan merged c54367c8 into master (1 year ago)
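
The title and the test names in the commits (test__flash_attention_impl, test__flash_attention_bwd_dkv) indicate this change wires the Pallas flash attention backward kernels into PyTorch/XLA so that the fused attention op can be differentiated. Below is a minimal sketch of driving the forward and backward pass from a TPU device; the wrapper name `torch_xla.experimental.custom_kernel.flash_attention`, its signature, and the tensor shapes are assumptions for illustration and are not confirmed by this page.

```python
# Sketch only: exercises a flash attention forward pass and the backward
# kernels (dq, dk, dv) from PyTorch/XLA on TPU. Names and shapes are assumed.
import torch
import torch_xla.core.xla_model as xm
from torch_xla.experimental.custom_kernel import flash_attention  # assumed module path

device = xm.xla_device()

# (batch, num_heads, seq_len, head_dim) inputs with gradients enabled.
q = torch.randn(1, 2, 1024, 128, device=device, requires_grad=True)
k = torch.randn(1, 2, 1024, 128, device=device, requires_grad=True)
v = torch.randn(1, 2, 1024, 128, device=device, requires_grad=True)

out = flash_attention(q, k, v)  # forward Pallas kernel
out.sum().backward()            # backward pass runs the Pallas backward kernels
xm.mark_step()                  # materialize the lazy XLA graph

print(q.grad.shape, k.grad.shape, v.grad.shape)
```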
