transformers
a5b226ce - Fix flash attention speed issue (#32028)

Fix flash attention speed issue (#32028): add `lru_cache` for speed.
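The commit's fix is to memoize a repeated computation with `functools.lru_cache` so it runs once instead of on every call. A minimal sketch of that pattern, with a hypothetical `supports_flash_attention` check standing in for the actual cached function (the real function name and logic in the transformers codebase are not shown here):

```python
from functools import lru_cache
import time

@lru_cache(maxsize=None)
def supports_flash_attention(config_name: str) -> bool:
    # Hypothetical stand-in for a check that was being recomputed on
    # every call before the fix; lru_cache stores the result per argument.
    time.sleep(0.01)  # simulate the repeated cost being avoided
    return "flash" in config_name

supports_flash_attention("flash_attention_2")  # computed (cache miss)
supports_flash_attention("flash_attention_2")  # served from cache (cache hit)

# cache_info() reports hits/misses, confirming the second call was free
print(supports_flash_attention.cache_info())
```

Because `lru_cache` keys on the function's arguments, this only helps when the function is called repeatedly with the same (hashable) inputs, which is the situation the commit addresses.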