transformers
ba0dc545
- Add gradient checkpointing to Whisper Flax (#22954)
Commit
2 years ago
Add gradient checkpointing to Whisper Flax (#22954)
* Add gradient checkpointing to Whisper Flax
* self.gradient_checkpointing only needed in nn.Module, removing unnecessary comments
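The commit note that `self.gradient_checkpointing` is "only needed in nn.Module" refers to the usual Flax pattern: the flag lives on the module itself, and when it is set the layer class is wrapped with `flax.linen.remat` so activations are recomputed in the backward pass instead of stored. Below is a minimal sketch of that pattern; the `Encoder`/`EncoderLayer` names and shapes are illustrative only, not the actual FlaxWhisper classes from this commit.

```python
# Minimal sketch (assumed names, not the FlaxWhisper implementation):
# gradient checkpointing in Flax via nn.remat, toggled by a module flag.
import jax
import jax.numpy as jnp
import flax.linen as nn


class EncoderLayer(nn.Module):
    hidden_size: int

    @nn.compact
    def __call__(self, x):
        x = nn.Dense(self.hidden_size)(x)
        return nn.relu(x)


class Encoder(nn.Module):
    hidden_size: int = 64
    num_layers: int = 4
    # Flag kept on the nn.Module, as the commit message notes.
    gradient_checkpointing: bool = False

    @nn.compact
    def __call__(self, x):
        # When enabled, wrap the layer class with nn.remat so its activations
        # are rematerialized during backprop rather than kept in memory.
        layer_cls = nn.remat(EncoderLayer) if self.gradient_checkpointing else EncoderLayer
        for _ in range(self.num_layers):
            x = layer_cls(self.hidden_size)(x)
        return x


if __name__ == "__main__":
    model = Encoder(gradient_checkpointing=True)
    dummy = jnp.ones((2, 16, 64))
    params = model.init(jax.random.PRNGKey(0), dummy)
    out = model.apply(params, dummy)
    print(out.shape)  # (2, 16, 64)
```

The trade-off is the standard one for gradient checkpointing: peak activation memory drops roughly with the number of rematerialized layers, at the cost of one extra forward computation of those layers during the backward pass.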
References
#22954 - Add gradient checkpointing to Whisper Flax
#27720 - Add common processor tests
#29969 - [SigLIP] Add fast tokenizer
#32831 - [Docs] Update resources
#33111 - [Backbone] Remove out_features everywhere
#33174 - [Zero-shot image classification pipeline] Remove tokenizer_kwargs
#39821 - Support MetaCLIP 2
#59 - Fix attention mask handling in EoMT-DINOv3 converter
#41212 - Add EoMT with DINOv3 backbone
#62 - Add initial DEIMv2 model implementation
Author
versae
Parents
a72b82eb