transformers
0040469b - Correct attention mask dtype for Flax GPT2 (#25636)

Correct attention mask dtype for Flax GPT2 (#25636)

* Correct attention mask dtype
* reformat code
* add a test for boolean mask
* convert test to fast test
* delete unwanted print
* use assertTrue for testing
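The commit's core change is to accept a boolean attention mask and cast it to the integer dtype Flax models work with. A minimal sketch of that idea in NumPy (the helper name and logic here are illustrative assumptions, not the actual transformers code):

```python
import numpy as np

def normalize_attention_mask(attention_mask):
    # Hypothetical helper sketching the dtype fix: Flax models operate on an
    # integer ("i4") attention mask, but callers may pass a boolean one.
    # Casting maps True -> 1 (attend) and False -> 0 (padding).
    attention_mask = np.asarray(attention_mask)
    if attention_mask.dtype == np.bool_:
        attention_mask = attention_mask.astype("i4")
    return attention_mask

# A boolean mask: two real tokens followed by one padding position.
bool_mask = np.array([[True, True, False]])
int_mask = normalize_attention_mask(bool_mask)
print(int_mask.dtype)     # int32
print(int_mask.tolist())  # [[1, 1, 0]]
```

Doing the cast once at the model boundary, as the commit does, lets downstream attention code assume a single mask dtype instead of branching on boolean versus integer input.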