transformers
75aa7c72 - [ModernBert] Prevent the attention mask from being None in ModernBertForSequenceClassification (#35991)

Committed 167 days ago
[ModernBert] Prevent the attention mask from being None in ModernBertForSequenceClassification (#35991)

* [ModernBert] Prevent the attention mask from being None in ModernBertForSequenceClassification
* fix the modular conversion
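The fix described above guards against a `None` attention mask by falling back to a default mask that attends over every token. A minimal pure-Python sketch of that guard pattern (the function name `ensure_attention_mask` is hypothetical and not the actual transformers implementation, which builds the mask as a tensor):

```python
def ensure_attention_mask(input_ids, attention_mask=None):
    """Illustrative guard: if no attention mask is supplied, default to
    attending over every token (a mask of all ones, same shape as the input).

    This mirrors the commit's idea only; the real model code operates on
    torch tensors, not Python lists.
    """
    if attention_mask is None:
        # One mask row per sequence, one 1 per token position.
        attention_mask = [[1] * len(seq) for seq in input_ids]
    return attention_mask


# With no mask given, every position is marked attendable.
mask = ensure_attention_mask([[101, 2023, 102]])
# An explicitly provided mask is passed through unchanged.
explicit = ensure_attention_mask([[101, 102]], attention_mask=[[1, 0]])
```

Downstream code (e.g. pooling logits for sequence classification) can then assume the mask is never `None`, which is the point of the change.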