transformers
Commit 0ce24f5a (124 days ago)
Fix Causality Handling in Flash Attention to Support Bidirectional Attention (#39707)

Fix the is_causal logic to enable bidirectional attention.

Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
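A minimal sketch of the kind of change the commit message describes (the actual diff is not shown here, and the names `module.is_causal` and `resolve_is_causal` are illustrative assumptions, not the exact code from the commit): derive the causal flag from the module's declared causality instead of effectively forcing it on, so encoder-style (bidirectional) models can use flash attention.

```python
# Hypothetical sketch, assuming a module attribute `is_causal` and an
# optional explicit attention mask; not the literal code from 0ce24f5a.

def resolve_is_causal(module, query, attention_mask):
    # Buggy behavior: causality was applied whenever there was more than
    # one query token, which breaks bidirectional (encoder-style) models:
    #   is_causal = query.shape[1] > 1
    #
    # Fixed behavior: only apply causal masking when the module actually
    # declares itself causal, there is more than one query token, and no
    # explicit attention mask already encodes the masking pattern.
    return (
        getattr(module, "is_causal", True)
        and query.shape[1] > 1
        and attention_mask is None
    )
```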
References
#39707 - Fix Causality Handling in Flash Attention to Support Bidirectional Attention
Author
lucaswychan
Parents
83dbebc4