transformers
0040469b
- Correct attention mask dtype for Flax GPT2 (#25636)
Commit
2 years ago
Correct attention mask dtype for Flax GPT2 (#25636)

* Correct attention mask dtype
* reformat code
* add a test for boolean mask
* convert test to fast test
* delete unwanted print
* use assertTrue for testing
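A minimal sketch of the kind of dtype normalization this commit is about, assuming a boolean attention mask needs to be cast to an integer dtype before Flax GPT-2 combines it with the causal mask; the helper name and exact casting step are illustrative, not the actual changed lines from the PR:

```python
import jax.numpy as jnp


def normalize_attention_mask(attention_mask):
    """Cast a boolean attention mask to int32 (hypothetical helper).

    A boolean mask such as [[True, True, False]] becomes
    [[1, 1, 0]] with dtype int32, so downstream comparisons like
    `attention_mask > 0` behave consistently regardless of the
    dtype the caller passed in.
    """
    if attention_mask.dtype == jnp.bool_:
        attention_mask = attention_mask.astype(jnp.int32)
    return attention_mask


# Usage sketch: a user-supplied boolean mask is normalized before
# being handed to the attention module.
mask = jnp.array([[True, True, False, False]])
print(normalize_attention_mask(mask))  # [[1 1 0 0]]
```

The commit also adds a fast test exercising a boolean mask (using assertTrue), which is the scenario the cast above is meant to cover.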
References
#25636 - Correct attention mask dtype for Flax GPT2
Author
liutianlin0121
Parents
4b796978