transformers
eaace0c6
- Optimize by not computing gradients for parameters set to requires_grad=False (#21236)
Commit
3 years ago
Optimize by not computing gradients for parameters set to requires_grad=False (#21236)

* Optimize by not computing gradients for parameters set to requires_grad=False
* Make change to retrigger the build
* Fix isort issue
* Fix issue
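A minimal sketch of the idea behind this change, assuming the common pattern of building optimizer parameter groups from model.named_parameters(): parameters frozen with requires_grad=False are skipped, so the optimizer never allocates gradient state for them. The helper name below is illustrative, not the actual Trainer code.

import torch
from torch import nn

def build_optimizer_grouped_parameters(model: nn.Module, weight_decay: float = 0.01):
    # Parameters that typically should not receive weight decay.
    no_decay = ("bias", "LayerNorm.weight")
    return [
        {
            "params": [
                p for n, p in model.named_parameters()
                # Skip frozen parameters: no gradients are needed for them.
                if p.requires_grad and not any(nd in n for nd in no_decay)
            ],
            "weight_decay": weight_decay,
        },
        {
            "params": [
                p for n, p in model.named_parameters()
                if p.requires_grad and any(nd in n for nd in no_decay)
            ],
            "weight_decay": 0.0,
        },
    ]

# Usage: freeze part of a model; only the trainable parameters reach the optimizer.
model = nn.Sequential(nn.Linear(8, 8), nn.Linear(8, 2))
for p in model[0].parameters():
    p.requires_grad = False  # frozen layer
optimizer = torch.optim.AdamW(build_optimizer_grouped_parameters(model), lr=1e-3)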
References
#21236 - Optimize by not computing gradients for parameters set to requires_grad=False
#27720 - Add common processor tests
#29969 - [SigLIP] Add fast tokenizer
#32831 - [Docs] Update resources
#33111 - [Backbone] Remove out_features everywhere
#33174 - [Zero-shot image classification pipeline] Remove tokenizer_kwargs
#39821 - Support MetaCLIP 2
#59 - Fix attention mask handling in EoMT-DINOv3 converter
#41212 - Add EoMT with DINOv3 backbone
#62 - Add initial DEIMv2 model implementation
Author
raghavanone
Parents
6e4d3f08