transformers
2ee9f9b6 - Fix computation of attention_probs when head_mask is provided. (#9853)

Commit · 4 years ago
Fix computation of attention_probs when head_mask is provided. (#9853)

* Fix computation of attention_probs when head_mask is provided.

  Signed-off-by: Morgan Funtowicz <funtowiczmo@gmail.com>

* Apply changes to the template

Co-authored-by: Lysandre <lysandre.debut@reseau.eseo.fr>
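For context, `head_mask` is applied multiplicatively to the post-softmax attention probabilities, zeroing out pruned heads. The sketch below is a minimal, hedged illustration of that pattern (not the actual transformers source); the function name `attention_with_head_mask` and the NumPy setup are assumptions for demonstration. The key detail the commit title points at is that the masked probabilities must be assigned back to `attention_probs`, so that both the context vectors and any returned attentions reflect the mask.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_head_mask(q, k, v, head_mask=None):
    """Toy multi-head attention (hypothetical helper, not the transformers API).

    q, k, v: arrays of shape (num_heads, seq_len, head_dim)
    head_mask: optional array of shape (num_heads,) with 0/1 entries
    """
    d = q.shape[-1]
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)   # (heads, seq, seq)
    attention_probs = softmax(scores)
    if head_mask is not None:
        # Assign the masked result back to attention_probs, so the
        # context below and the returned attentions both see the mask.
        attention_probs = attention_probs * head_mask[:, None, None]
    context = attention_probs @ v                     # (heads, seq, head_dim)
    return context, attention_probs
```

With a mask of `[0.0, 1.0]`, head 0's probabilities and context vectors come out as all zeros, while head 1 is unchanged.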