transformers
4d806dba - Fix bug of _prepare_4d_attention_mask (#27847)
Commit
2 years ago
Fix bug of _prepare_4d_attention_mask (#27847)

* use _prepare_4d_attention_mask
* fix comment
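For context, the helper named in the commit message is `_prepare_4d_attention_mask` from transformers' `modeling_attn_mask_utils` module. The snippet below is a minimal sketch of how that helper is typically called, not the actual diff from #27847: it expands a 2D padding mask of shape (batch, seq_len) into the 4D additive mask of shape (batch, 1, tgt_len, src_len) that attention layers consume. The sample tensor values are illustrative only.

```python
import torch
from transformers.modeling_attn_mask_utils import _prepare_4d_attention_mask

# 2D padding mask: 1 = attend to this token, 0 = padding (illustrative values)
attention_mask = torch.tensor([[1, 1, 1, 0],
                               [1, 1, 0, 0]])

# Expand to a 4D additive mask; masked positions become the dtype's most
# negative value so they are effectively zeroed out by the softmax in attention.
expanded = _prepare_4d_attention_mask(attention_mask, dtype=torch.float32)

print(expanded.shape)  # torch.Size([2, 1, 4, 4])
```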
References
#27847 - Fix bug of _prepare_4d_attention_mask
Author
jiqing-feng
Parents
75336c17