xla
1ccd6a64
- [Pallas] Add a bfloat16 flash attention test case (#6748)
Commit
1 year ago
[Pallas] Add a bfloat16 flash attention test case (#6748)

Summary: Add a bfloat16 flash attention test case.

Test Plan: python test/test_pallas.py
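The commit message does not include the test body itself. As a rough, hypothetical illustration of what a bfloat16 attention test typically checks — a reduced-precision result compared against a float32 reference under a loose tolerance — here is a self-contained NumPy sketch. NumPy has no native bfloat16, so `to_bf16` emulates bfloat16 rounding by keeping the top 16 bits of the float32 representation; all names here are illustrative and are not taken from `test/test_pallas.py`.

```python
import numpy as np

def to_bf16(x):
    # Emulate bfloat16 by rounding a float32 array to the nearest value
    # representable with an 8-bit mantissa (keep the top 16 bits).
    u = x.astype(np.float32).view(np.uint32)
    u = (u + 0x7FFF + ((u >> 16) & 1)) & 0xFFFF0000
    return u.view(np.float32)

def attention(q, k, v):
    # Plain softmax(Q K^T / sqrt(d)) V reference (not the Pallas kernel).
    s = q @ k.T / np.sqrt(q.shape[-1])
    s = s - s.max(axis=-1, keepdims=True)   # stabilize the softmax
    p = np.exp(s)
    p = p / p.sum(axis=-1, keepdims=True)
    return p @ v

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((8, 16), dtype=np.float32) for _ in range(3))

ref = attention(q, k, v)                              # float32 reference
low = attention(to_bf16(q), to_bf16(k), to_bf16(v))   # bfloat16-rounded inputs

# bfloat16 carries only ~3 decimal digits, so the tolerance is loose.
assert np.allclose(ref, low, atol=1e-1, rtol=1e-2)
```

The real test presumably invokes the Pallas flash attention kernel on TPU rather than this dense reference; the sketch only demonstrates the tolerance-based comparison pattern such a test relies on.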
References
#6748 - [Pallas] Add a bfloat16 flash attention test case
Author
alanwaketan
Parents
7cf9f10a