transformers
aebca696 - Fix missing output_attentions in PT/Flax equivalence test (#16271)

Committed 3 years ago
Fix missing output_attentions in PT/Flax equivalence test (#16271)

* fix - set output_attentions to True
* Update tests/test_modeling_flax_common.py
* update for has_attentions
* overwrite check_outputs in FlaxBigBirdModelTest

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
Co-authored-by: Suraj Patil <surajp815@gmail.com>
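The comparison this commit fixes can be sketched as follows. This is a minimal, hypothetical stand-in for the `check_outputs` logic in `tests/test_modeling_flax_common.py`: it uses plain NumPy arrays in place of real PyTorch and Flax model outputs, and only illustrates the point that when `output_attentions=True` the per-layer attentions tuple is part of the output structure and therefore gets compared too.

```python
import numpy as np

def check_outputs(pt_out, flax_out, tol=1e-5):
    """Recursively compare two nested output structures.

    Nested tuples/lists (e.g. per-layer attentions) are walked
    element-wise; leaves are compared numerically with a tolerance.
    """
    if isinstance(pt_out, (tuple, list)):
        assert len(pt_out) == len(flax_out), "output structures differ"
        for p, f in zip(pt_out, flax_out):
            check_outputs(p, f, tol)
    else:
        assert np.allclose(np.asarray(pt_out), np.asarray(flax_out), atol=tol), \
            "outputs diverge beyond tolerance"

# Toy stand-ins for model outputs: (last_hidden_state, attentions).
rng = np.random.default_rng(0)
hidden = rng.standard_normal((1, 4, 8))
attn = tuple(rng.standard_normal((1, 2, 4, 4)) for _ in range(3))

# With output_attentions=True, both "backends" return the attentions
# tuple, so the equivalence check actually covers it.
check_outputs((hidden, attn), (hidden.copy(), tuple(a.copy() for a in attn)))
print("outputs match")
```

In the real test suite the leaves are `torch.Tensor` and `jax.numpy` arrays converted to NumPy before comparison, and models without attention layers are skipped via `has_attentions`, which is why the commit also overrides `check_outputs` in `FlaxBigBirdModelTest`.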