transformers
aebca696 - Fix missing output_attentions in PT/Flax equivalence test (#16271)
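The commit concerns the cross-framework equivalence test, which compares PyTorch and Flax model outputs; without `output_attentions=True` the attention tensors are never produced, so they cannot be compared. Below is a minimal, hedged sketch of that kind of check, not the actual test from #16271: the small `BertConfig` values, the temp-dir round trip, and the tolerance are illustrative assumptions.

```python
# Sketch of a PT/Flax equivalence check that also covers attentions.
# Config sizes and tolerance are illustrative, not the values used in the real test.
import tempfile

import numpy as np
import torch
from transformers import BertConfig, BertModel, FlaxBertModel

config = BertConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=4,
    intermediate_size=64,
)

# Randomly initialized PyTorch model in eval mode (dropout disabled).
pt_model = BertModel(config).eval()

# Load the same weights into the Flax model so both frameworks share parameters.
with tempfile.TemporaryDirectory() as tmp:
    pt_model.save_pretrained(tmp)
    fx_model = FlaxBertModel.from_pretrained(tmp, from_pt=True)

input_ids = np.array([[1, 2, 3, 4, 5]], dtype=np.int64)

with torch.no_grad():
    pt_out = pt_model(torch.tensor(input_ids), output_attentions=True)
fx_out = fx_model(input_ids, output_attentions=True)

# Compare attention maps layer by layer; if output_attentions were missing,
# these fields would not be populated and the check would silently be skipped.
for pt_att, fx_att in zip(pt_out.attentions, fx_out.attentions):
    assert np.allclose(pt_att.numpy(), np.asarray(fx_att), atol=1e-5)
```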
