transformers
Fix: Conditionally import `torch.distributed.fsdp` in `trainer_seq2seq.py`
#44507
Merged
