transformers
277d49a5
- Do not initialize `torch.distributed` process group if one is already initialized (#16487)
Commit
3 years ago
Do not initialize `torch.distributed` process group if one is already initialized (#16487)

* Do not initialize torch process group twice
* Apply suggestions from code review
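A minimal sketch of the guard this commit describes, assuming the standard `torch.distributed` API; the function name `maybe_init_process_group` is illustrative and not taken from the Trainer code itself:

```python
import torch.distributed as dist


def maybe_init_process_group(backend: str = "nccl") -> None:
    # An outer launcher (e.g. a distributed runtime or another framework)
    # may have already created the default process group. Calling
    # init_process_group a second time raises an error, so only
    # initialize when no group exists yet.
    if dist.is_available() and not dist.is_initialized():
        dist.init_process_group(backend=backend)
```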
References
#16487 - Do not initialize `torch.distributed` process group if one is already initialized
#19449 - [WIP] Fix weights initialization of several vision models
#27720 - Add common processor tests
#29969 - [SigLIP] Add fast tokenizer
#32831 - [Docs] Update resources
#33111 - [Backbone] Remove out_features everywhere
#33174 - [Zero-shot image classification pipeline] Remove tokenizer_kwargs
#39821 - Support MetaCLIP 2
#59 - Fix attention mask handling in EoMT-DINOv3 converter
#62 - Add initial DEIMv2 model implementation
#65 - Fix RTDetrV2 sine position embedding ordering
Author
Yard1
Parents
2b483230