transformers
Commit 026a5d08: [T5 fp16] Fix fp16 in T5 (#4436)
Committed 5 years ago

[T5 fp16] Fix fp16 in T5 (#4436)
* fix fp16 in t5
* make style
* refactor invert_attention_mask fn
* fix typo
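The `invert_attention_mask` refactor mentioned above touches a common fp16 pitfall: when a 1/0 attention mask is converted into an additive bias with a hard-coded constant like `-1e9`, that constant overflows half precision (whose minimum representable value is roughly -65504) and becomes `-inf`, which can propagate NaNs through the softmax. A minimal NumPy sketch of the dtype-aware idea follows; it is an illustration only, not the actual PyTorch implementation inside transformers:

```python
import numpy as np

def invert_attention_mask(mask: np.ndarray, dtype=np.float32) -> np.ndarray:
    """Convert a 1 (keep) / 0 (mask) attention mask into an additive bias.

    Masked positions get the smallest finite value representable in
    `dtype`, so the bias stays finite even in float16.
    """
    min_value = np.finfo(dtype).min
    return ((1.0 - mask) * min_value).astype(dtype)

mask = np.array([1, 1, 0], dtype=np.float32)

# dtype-aware bias: finite in fp16 (masked slot becomes about -65504)
bias_fp16 = invert_attention_mask(mask, np.float16)

# naive constant: -1e9 overflows fp16 to -inf at the masked position
naive_fp16 = ((1.0 - mask) * -1e9).astype(np.float16)
```

Here `bias_fp16` is finite everywhere, while `naive_fp16` contains `-inf` at the masked position, which is exactly the failure mode a dtype-aware mask avoids.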
References
#4436 - [T5 fp16] Fix fp16 in T5
Author: patrickvonplaten
Parent: fa6113f9