transformers
Fix: Conditionally import `torch.distributed.fsdp` in `trainer_seq2seq.py`
#44507
Merged
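
The actual diff is not shown on this page, but the PR title describes a conditional-import guard: only pull in `torch.distributed.fsdp` when the distributed package is actually usable, so that importing `trainer_seq2seq.py` does not fail on builds without distributed support. A minimal sketch of that pattern (hypothetical, not the merged diff) might look like:

```python
import importlib.util

# Hedged sketch of a conditional FSDP import, in the spirit of this PR's
# title; the exact guard used in trainer_seq2seq.py may differ.
if importlib.util.find_spec("torch") is not None:
    import torch

    if torch.distributed.is_available():
        # Safe to import FSDP: the distributed package exists on this build.
        from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
    else:
        # torch is installed but built without distributed support
        # (e.g. some CPU-only or Windows builds).
        FSDP = None
else:
    # torch is not installed at all; keep the name defined so callers
    # can check `FSDP is None` instead of hitting an ImportError.
    FSDP = None
```

Callers then branch on `FSDP is None` rather than importing eagerly at module load, which is what made the unconditional import fail on unsupported platforms.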


Timeline:

- 0xDELUXA: `fix: conditionally import torch.distributed.fsdp in trainer_seq2seq` (08726390)
- 0xDELUXA changed the title from "fix: conditionally import torch.distributed.fsdp in trainer_seq2seq" to "Fix: Conditionally import `torch.distributed.fsdp` in `trainer_seq2seq.py`" 30 days ago
- 0xDELUXA: `fix: sort imports in trainer_seq2seq.py` (0a1c5377)
- 3outeille: Merge branch 'main' into fix-torch-distributed-import (9cca87cb)
- 3outeille approved these changes on 2026-03-09
- 3outeille: Merge branch 'main' into fix-torch-distributed-import (fb0ac7d9)
- 3outeille enabled auto-merge (squash) 27 days ago
- 3outeille merged adc0d9ae into main 27 days ago
- 0xDELUXA deleted the fix-torch-distributed-import branch 23 days ago
