transformers
75aa7c72
- [ModernBert] Prevent the attention mask from being None in ModernBertForSequenceClassification (#35991)
Commit
167 days ago
[ModernBert] Prevent the attention mask from being None in ModernBertForSequenceClassification (#35991)

* [ModernBert] Prevent the attention mask from being None in ModernBertForSequenceClassification
* fix the modular conversion
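The diff itself is not reproduced on this page. As a minimal sketch of the kind of guard the commit title describes, assuming the sequence-classification head needs a mask for pooling over hidden states, the fix amounts to substituting an all-ones mask (i.e. no padding) when the caller passes None. The helper name `ensure_attention_mask` is hypothetical, not the upstream API:

```python
from typing import Optional

import torch


def ensure_attention_mask(
    input_ids: torch.Tensor, attention_mask: Optional[torch.Tensor]
) -> torch.Tensor:
    """Fall back to an all-ones mask (every token treated as real) when none is given."""
    if attention_mask is None:
        attention_mask = torch.ones_like(input_ids)
    return attention_mask


# Usage: a batch of two sequences passed without an explicit mask.
input_ids = torch.tensor([[101, 2023, 102], [101, 2003, 102]])
mask = ensure_attention_mask(input_ids, None)  # all-ones tensor, same shape as input_ids
```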
References
#35991 - [ModernBert] Prevent the attention mask from being None in ModernBertForSequenceClassification
Author
ashmikuz
Parents
04b751f0