[Pallas] Add a bfloat16 flash attention test case #6748
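For context, a minimal sketch of what a bfloat16 flash attention test case could look like, assuming the flash_attention wrapper exposed in torch_xla.experimental.custom_kernel; the shapes, tolerances, and helper names here are illustrative assumptions, not the actual test added in this PR.

```python
import torch
import torch_xla.core.xla_model as xm
from torch_xla.experimental.custom_kernel import flash_attention


def attention_reference(q, k, v):
  # Plain softmax(QK^T)V reference to compare against the Pallas kernel.
  attn_weight = q @ k.transpose(-2, -1)
  attn_weight = torch.nn.functional.softmax(attn_weight, dim=-1)
  return attn_weight @ v


def test_flash_attention_bf16():
  device = xm.xla_device()
  q = torch.randn(3, 2, 128, 4, dtype=torch.bfloat16).to(device)
  k = torch.randn(3, 2, 128, 4, dtype=torch.bfloat16).to(device)
  v = torch.randn(3, 2, 128, 4, dtype=torch.bfloat16).to(device)

  out = flash_attention(q, k, v)
  expected = attention_reference(q, k, v)
  # bfloat16 needs a looser tolerance than float32 would.
  assert torch.allclose(out.cpu(), expected.cpu(), atol=1e-1, rtol=1e-2)
```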
JackCaoG approved these changes on 2024-03-14.
alanwaketan force-pushed from 35b53136 to f54ede42 1 year ago.
alanwaketan force-pushed from f54ede42 to c0497043 1 year ago.
Initial commit (5c0e9bb3)
Fix linters (6092ffc3)
alanwaketan force-pushed from c0497043 to 6092ffc3 1 year ago.
skip tpu v2 (8f22ce34)
fix linter (702b327d)
alanwaketan deleted the alanwaketan/flash_attention_2 branch 1 year ago.