[Pallas] Add a bfloat16 flash attention test case #6748
Merged

alanwaketan merged 4 commits into master from alanwaketan/flash_attention_2
alanwaketan added the backport_2.3 label
alanwaketan requested a review from JackCaoG 1 year ago
alanwaketan self-assigned this 1 year ago
JackCaoG approved these changes on 2024-03-14
alanwaketan force-pushed from 35b53136 to f54ede42 1 year ago
alanwaketan force-pushed from f54ede42 to c0497043 1 year ago
alanwaketan: Initial commit (5c0e9bb3)
alanwaketan: Fix linters (6092ffc3)
alanwaketan force-pushed from c0497043 to 6092ffc3 1 year ago
alanwaketan: skip tpu v2 (8f22ce34)
alanwaketan: fix linter (702b327d)
alanwaketan merged 1ccd6a64 into master 1 year ago
alanwaketan deleted the alanwaketan/flash_attention_2 branch 1 year ago
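For context, a bfloat16 flash attention test along these lines might look like the sketch below. This is a minimal illustration, not the PR's actual diff: it assumes the flash_attention wrapper exported from torch_xla.experimental.custom_kernel, and the tensor shapes, tolerances, and the attention_reference helper are all hypothetical. The "skip tpu v2" commit suggests the real test is additionally gated to newer TPU generations.

```python
import unittest

import torch
import torch_xla.core.xla_model as xm
# Assumption: the Pallas flash attention kernel is exposed through this
# experimental wrapper in torch_xla.
from torch_xla.experimental.custom_kernel import flash_attention


def attention_reference(q, k, v):
    # Plain softmax attention, used as the ground truth. Hypothetical
    # helper, not part of the PR.
    attn_weight = q @ k.transpose(-2, -1)
    attn_weight = torch.nn.functional.softmax(attn_weight, dim=-1)
    return attn_weight @ v


class FlashAttentionBf16Test(unittest.TestCase):

    def test_flash_attention_bf16(self):
        device = xm.xla_device()
        # Illustrative shapes: (batch, heads, seq_len, head_dim).
        q = torch.randn(3, 2, 128, 4, dtype=torch.bfloat16).to(device)
        k = torch.randn(3, 2, 128, 4, dtype=torch.bfloat16).to(device)
        v = torch.randn(3, 2, 128, 4, dtype=torch.bfloat16).to(device)

        o = flash_attention(q, k, v)
        expected = attention_reference(q, k, v)

        # bfloat16 carries ~8 bits of mantissa, so compare with loose
        # tolerances (values here are illustrative, not the PR's).
        self.assertTrue(
            torch.allclose(o.cpu(), expected.cpu(), atol=1e-1, rtol=1e-2))


if __name__ == "__main__":
    unittest.main()
```

The interesting part of such a test is the comparison against a plain softmax-attention reference: since bfloat16 accumulates noticeably more rounding error than float32, the tolerances have to be relaxed relative to a float32 variant of the same test.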
