xla
[Pallas] Add a bfloat16 flash attention test case #6810
Merged

lsy323 merged 1 commit into r2.3 from alanwaketan/bp_1
alanwaketan — Initial commit (92d1f98a)
alanwaketan requested a review from will-cromar 1 year ago
alanwaketan requested a review from JackCaoG 1 year ago
alanwaketan requested a review from yeounoh 1 year ago
alanwaketan requested a review from mateuszlewko 1 year ago
alanwaketan requested a review from stgpetrovic 1 year ago
alanwaketan changed the base branch from master to r2.3 1 year ago
alanwaketan requested a review from lsy323 1 year ago
JackCaoG approved these changes on 2024-03-22
lsy323 approved these changes on 2024-03-25
lsy323 merged 3522be16 into r2.3 1 year ago
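The PR's actual test exercises the Pallas flash attention kernel on TPU, which is not reproducible here. As a hedged, CPU-only sketch of the kind of numerical check a bfloat16 attention test performs, the snippet below runs plain scaled dot-product attention in bfloat16 and compares it against a float32 reference with a loose tolerance; the function and tensor names are illustrative, not taken from the PR.

```python
import math
import torch

def attention_reference(q, k, v):
    # Plain softmax(Q K^T / sqrt(d)) V attention, the computation that a
    # flash attention kernel implements in a tiled, memory-efficient way.
    scale = 1.0 / math.sqrt(q.shape[-1])
    scores = (q @ k.transpose(-2, -1)) * scale
    return torch.softmax(scores, dim=-1) @ v

torch.manual_seed(0)
q = torch.randn(8, 16)
k = torch.randn(8, 16)
v = torch.randn(8, 16)

out_f32 = attention_reference(q, k, v)
out_bf16 = attention_reference(q.bfloat16(), k.bfloat16(), v.bfloat16())

# bfloat16 keeps only ~8 mantissa bits, so the comparison needs a loose
# tolerance, mirroring the relaxed assertions bf16 test cases typically use.
assert torch.allclose(out_bf16.float(), out_f32, atol=1e-1, rtol=5e-2)
```

The design point such a test pins down is dtype handling: the kernel should accept bfloat16 inputs, produce bfloat16 outputs, and stay within a tolerance of the float32 reference rather than matching it exactly.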