transformers
2db2c887
- :rotating_light: Modify ModernBERT's default attention implementation to stop using FA (#43764)
76 days ago
:rotating_light: Modify ModernBERT's default attention implementation to stop using FA (#43764)

* Modify ModernBERT's default attention implementation to stop using FA
* Style
* Revised based on comments
* Update doc
References
#43764 - :rotating_light: Modify ModernBERT's default attention implementation to stop using FA
Author
YangKai0616
Parents
0b2900dd