transformers
2db2c887 - 🚨 Modify ModernBERT's default attention implementation to stop using FA (#43764)

Committed 76 days ago
* Modify ModernBERT's default attention implementation to stop using FA
* Style
* Revised based on comments
* Update doc
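Since this change stops ModernBERT from defaulting to FlashAttention, users who relied on the old behavior can opt back in explicitly via `attn_implementation` in `from_pretrained`. A minimal sketch, assuming the `flash-attn` package is installed, a supported GPU is available, and using the public `answerdotai/ModernBERT-base` checkpoint for illustration:

```python
import torch
from transformers import AutoModel

# After this change, loading ModernBERT with no extra arguments uses the
# library's standard default attention backend (typically SDPA when
# available) instead of FlashAttention.
model = AutoModel.from_pretrained("answerdotai/ModernBERT-base")

# To restore the previous FlashAttention behavior, request it explicitly.
# FlashAttention 2 requires the flash-attn package, a supported GPU, and
# fp16/bf16 weights.
model_fa = AutoModel.from_pretrained(
    "answerdotai/ModernBERT-base",
    attn_implementation="flash_attention_2",
    torch_dtype=torch.bfloat16,
)
```

The checkpoint name and dtype choice here are illustrative assumptions, not part of the commit; the `attn_implementation` argument is the standard transformers mechanism for selecting an attention backend.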