transformers
4faf6752
- Fix Qwen2Audio flash attention mask format for generation (#41843)
Commit
87 days ago
Fix Qwen2Audio flash attention mask format for generation (#41843)

* Fix Qwen2Audio flash attention mask format for generation
* use create_bidirectional_mask instead
* fix
* fix
* empty
* fix
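The commit message says the fix switches Qwen2Audio's flash-attention mask handling to a bidirectional mask helper. As a rough illustration of what that means (this is a hedged sketch, not the actual transformers implementation; the function name and list-based format here are illustrative assumptions), a bidirectional mask lets every real token attend to every other real token, with no causal triangle, while still masking out padding:

```python
# Illustrative sketch only: expand a 2D padding mask [batch, seq] of 0/1
# values into a square bidirectional mask [batch, seq, seq]. This mirrors
# the idea behind the commit ("use create_bidirectional_mask instead"),
# but the name and signature here are assumptions, not the library's API.

def create_bidirectional_mask(padding_mask):
    """Position (i, j) is attendable iff both tokens i and j are real
    (non-padding). No causal masking: encoder-style tokens, such as
    audio features, may attend in both directions."""
    batch = len(padding_mask)
    seq = len(padding_mask[0])
    out = []
    for b in range(batch):
        row = padding_mask[b]
        out.append(
            [[row[i] and row[j] for j in range(seq)] for i in range(seq)]
        )
    return out

# Example: one sequence of 3 tokens where the last token is padding.
mask = create_bidirectional_mask([[1, 1, 0]])
```

In this toy version, `mask[0][0][1]` and `mask[0][1][0]` are both 1 (attention flows in both directions between real tokens), while any pair involving the padded position is 0.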
References
#41843 - Fix Qwen2Audio flash attention mask format for generation
Author
Abdennacer-Badaoui
Parents
bb6028cb