[Pallas] Make a FlashAttention Wrapper #6785
Merged


alanwaketan merged 6 commits into master from alanwaketan/flash_attention_3
Commits:
- 0a44bb6e  tmp
- 373472de  tmp
- 66026e56  introduce flash_attention
- fe63be99  Add test case
- 522de564  Fix the test
- c36fb510  Fix linters
alanwaketan requested a review from will-cromar 1 year ago
alanwaketan requested a review from JackCaoG 1 year ago
alanwaketan self-assigned this 1 year ago
alanwaketan added the backport_2.3 label
JackCaoG approved these changes on 2024-03-20
alanwaketan merged fcf24b6c into master 1 year ago
alanwaketan deleted the alanwaketan/flash_attention_3 branch 1 year ago
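For context, the "introduce flash_attention" commit adds a Python wrapper that dispatches to the Pallas FlashAttention kernel on TPU. Below is a minimal usage sketch; it assumes the wrapper is exposed as torch_xla.experimental.custom_kernel.flash_attention and takes (q, k, v) tensors of shape (batch, num_heads, seq_len, head_dim). The module path and signature are assumptions based on the commit messages, not confirmed by this page.

```python
# Hedged usage sketch for the FlashAttention wrapper added by this PR.
# Assumption: the wrapper lives at torch_xla.experimental.custom_kernel
# and accepts q, k, v of shape (batch, num_heads, seq_len, head_dim).
import torch
import torch_xla.core.xla_model as xm
from torch_xla.experimental.custom_kernel import flash_attention

device = xm.xla_device()  # requires a TPU backend for the Pallas kernel

batch, heads, seq_len, head_dim = 4, 8, 1024, 128
q = torch.randn(batch, heads, seq_len, head_dim, device=device)
k = torch.randn(batch, heads, seq_len, head_dim, device=device)
v = torch.randn(batch, heads, seq_len, head_dim, device=device)

# The wrapper runs attention blockwise on-chip, so the full
# (seq_len, seq_len) attention matrix is never materialized.
out = flash_attention(q, k, v)
xm.mark_step()
print(out.shape)  # torch.Size([4, 8, 1024, 128])
```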
