transformers
0ce24f5a - Fix Causality Handling in Flash Attention to Support Bidirectional Attention (#39707)

Committed 124 days ago
Fix the is_causal logic to enable bidirectional attention.

Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
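The commit message says the fix adjusts how the `is_causal` flag is resolved so that bidirectional attention is not silently forced into causal masking. A minimal sketch of that kind of logic, with a hypothetical helper name and signature (this is an illustration of the idea, not the PR's actual diff):

```python
def resolve_is_causal(is_causal: bool, query_length: int) -> bool:
    """Decide whether causal masking should be applied.

    Hypothetical helper: causal masking only applies when the caller
    requested it AND there is more than one query position (a single
    query token cannot attend to future tokens anyway). Bidirectional
    models pass is_causal=False, and that choice must be respected
    rather than overridden to True.
    """
    return is_causal and query_length > 1


# A causal decoder with a multi-token query keeps causal masking:
print(resolve_is_causal(True, 8))    # causal model, multiple queries
# A bidirectional encoder never gets causal masking:
print(resolve_is_causal(False, 8))   # bidirectional model
# Single-token decoding can skip the causal mask:
print(resolve_is_causal(True, 1))    # one query position
```

The key point of such a fix is that the flag supplied by the model configuration is honored, instead of an internal code path unconditionally assuming causality.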