[Pallas] Make a FlashAttention Wrapper #6785
tmp (0a44bb6e)
tmp (373472de)
introduce flash_attention (66026e56)
Add test case (fe63be99)
Fix the test (522de564)
Fix linters (c36fb510)
JackCaoG approved these changes on 2024-03-20
alanwaketan deleted the alanwaketan/flash_attention_3 branch 1 year ago
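For context, the PR title and the "introduce flash_attention" commit indicate that this change adds a Python-level wrapper around the Pallas FlashAttention kernel. Below is a minimal, hypothetical usage sketch; the module path (`torch_xla.experimental.custom_kernel.flash_attention`), the tensor layout, and the shapes are assumptions for illustration, not details confirmed by this page.

```python
# Minimal sketch, assuming the wrapper is exposed as
# torch_xla.experimental.custom_kernel.flash_attention and that inputs are
# laid out as [batch, num_heads, seq_len, head_dim] on a TPU device.
import torch
import torch_xla.core.xla_model as xm
from torch_xla.experimental.custom_kernel import flash_attention

device = xm.xla_device()

# Query, key, and value tensors; shapes and dtype here are illustrative only.
q = torch.randn(1, 2, 128, 64, dtype=torch.bfloat16, device=device)
k = torch.randn(1, 2, 128, 64, dtype=torch.bfloat16, device=device)
v = torch.randn(1, 2, 128, 64, dtype=torch.bfloat16, device=device)

# The wrapper is expected to dispatch to the Pallas FlashAttention kernel
# and return an output with the same shape as the query tensor.
out = flash_attention(q, k, v)
print(out.shape)  # expected: torch.Size([1, 2, 128, 64])
```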