transformers
34678db4 - Fix Seq2seqTrainer decoder attention mask (#26841)

Committed 2 years ago
Don't drop decoder_input_ids without also dropping decoder_attention_mask.
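For context, a minimal sketch of the behavior the commit message describes, not the exact upstream diff: when Seq2SeqTrainer strips decoder_input_ids from the batch before calling generate() (because generate() will rebuild them from labels), the matching decoder_attention_mask has to be stripped as well, otherwise generate() receives a mask that no longer corresponds to any decoder input. The helper name prepare_generation_inputs and the simplified condition below are illustrative assumptions, not the library's API.

```python
def prepare_generation_inputs(inputs: dict) -> dict:
    """Filter a seq2seq batch before handing it to model.generate().

    `inputs` is assumed to be the usual batch dict with keys such as
    "input_ids", "attention_mask", "labels", "decoder_input_ids",
    and "decoder_attention_mask" (names follow the transformers convention).
    """
    if "labels" in inputs and "decoder_input_ids" in inputs:
        # Buggy behavior dropped only "decoder_input_ids", leaving a stale
        # "decoder_attention_mask" behind. The fix drops both together.
        return {
            k: v
            for k, v in inputs.items()
            if k not in ("decoder_input_ids", "decoder_attention_mask")
        }
    return dict(inputs)


if __name__ == "__main__":
    batch = {
        "input_ids": [[1, 2, 3]],
        "attention_mask": [[1, 1, 1]],
        "labels": [[4, 5, 6]],
        "decoder_input_ids": [[0, 4, 5]],
        "decoder_attention_mask": [[1, 1, 1]],
    }
    print(sorted(prepare_generation_inputs(batch)))
    # ['attention_mask', 'input_ids', 'labels']
```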