[Pallas] Add a bfloat16 flash attention test case #6810
Commit 92d1f98a: Initial commit
alanwaketan changed the base branch from `master` to `r2.3` (1 year ago)
JackCaoG approved these changes on 2024-03-22
lsy323 approved these changes on 2024-03-25
lsy323 merged commit 3522be16 into r2.3 (1 year ago)
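The PR diff is not shown here, but the title describes its purpose: verifying flash attention accuracy under bfloat16. As a rough illustration of what such a test checks (a hypothetical NumPy sketch, not the actual torch_xla Pallas test), one can compare attention computed on bfloat16-rounded inputs against a float32 reference, using a tolerance loosened to account for bfloat16's ~8 bits of mantissa precision:

```python
import numpy as np

def to_bf16(x):
    # Simulate bfloat16 by zeroing the low 16 bits of the float32
    # representation (truncation; real bfloat16 rounds to nearest even).
    u = np.asarray(x, dtype=np.float32).view(np.uint32)
    return (u & np.uint32(0xFFFF0000)).view(np.float32)

def attention(q, k, v):
    # Plain scaled dot-product attention as a float32 reference.
    scores = q @ k.T / np.sqrt(np.float32(q.shape[-1]))
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    probs = np.exp(scores)
    probs /= probs.sum(axis=-1, keepdims=True)
    return probs @ v

rng = np.random.default_rng(0)
q = rng.standard_normal((8, 16)).astype(np.float32)
k = rng.standard_normal((8, 16)).astype(np.float32)
v = rng.standard_normal((8, 16)).astype(np.float32)

ref = attention(q, k, v)
out = attention(to_bf16(q), to_bf16(k), to_bf16(v))
# bfloat16 keeps only ~8 mantissa bits, so the tolerance must be
# much looser than a float32-vs-float32 comparison would use.
assert np.allclose(out, ref, atol=5e-2, rtol=5e-2)
```

The real test presumably runs the Pallas flash attention kernel on TPU rather than this NumPy stand-in; the sketch only shows the shape of the comparison (low-precision output vs. high-precision reference under a relaxed tolerance).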