transformers
a5b226ce
- Fix flash attention speed issue (#32028)
Committed 1 year ago

Fix flash attention speed issue (#32028)

Add the lru_cache for speed
References
#32028 - Fix flash attention speed issue
Author
Cyrilvallez
Parents
a1844a32
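
The commit body says an lru_cache was added for speed. Below is a minimal, hypothetical sketch of that pattern: memoizing an expensive flash-attention availability/version check with functools.lru_cache so it runs once per process instead of on every attention call. The function name, the version string, and the use of the packaging library are illustrative assumptions, not the actual code changed in this commit.

```python
# Sketch only: memoize a costly capability check so repeated calls are cheap.
# Names below (is_flash_attn_at_least, "2.1.0") are assumptions for illustration.
from functools import lru_cache
import importlib.metadata

from packaging import version


@lru_cache
def is_flash_attn_at_least(min_version: str = "2.1.0") -> bool:
    """Return True if flash-attn is installed and at least `min_version`.

    The metadata lookup touches the filesystem, so calling it on every
    forward pass is slow; lru_cache evaluates it once per argument value.
    """
    try:
        installed = importlib.metadata.version("flash_attn")
    except importlib.metadata.PackageNotFoundError:
        return False
    return version.parse(installed) >= version.parse(min_version)
```

With the decorator in place, only the first call pays the cost of the package-metadata lookup; later calls with the same argument return the cached boolean, which is the speed effect the commit message describes.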