xla
3522be16
- [Pallas] Add a bfloat16 flash attention test case (#6810)
Commit
1 year ago
[Pallas] Add a bfloat16 flash attention test case (#6810)
References
#6810 - [Pallas] Add a bfloat16 flash attention test case
Author
alanwaketan
Parents
8240d05b
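Below is a minimal sketch of what a bfloat16 flash attention test case for the Pallas kernel wrapper might look like; it is not the code from this commit. It assumes the torch_xla.experimental.custom_kernel.flash_attention wrapper, a TPU device, and illustrative tensor shapes, and compares the kernel output against plain softmax attention with a loose tolerance to account for bfloat16 precision.

```python
# Hypothetical sketch, not the test added in #6810.
import torch
import torch_xla.core.xla_model as xm
from torch_xla.experimental.custom_kernel import flash_attention


def attention_reference(q, k, v):
    # Plain softmax attention used as the numerical reference.
    attn_weight = torch.nn.functional.softmax(q @ k.transpose(-2, -1), dim=-1)
    return attn_weight @ v


def test_flash_attention_bf16():
    device = xm.xla_device()
    # Illustrative shapes: (batch, heads, seq_len, head_dim).
    q = torch.randn(3, 2, 128, 4, dtype=torch.bfloat16, device=device)
    k = torch.randn(3, 2, 128, 4, dtype=torch.bfloat16, device=device)
    v = torch.randn(3, 2, 128, 4, dtype=torch.bfloat16, device=device)

    out = flash_attention(q, k, v)
    expected = attention_reference(q, k, v)

    # bfloat16 has limited precision, so use loose tolerances.
    assert torch.allclose(out.cpu(), expected.cpu(), atol=1e-1, rtol=1e-2)
```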