transformers
2935a1be
Fix fp32_ln for various models (#41605)
Commit
61 days ago
Fix fp32_ln for various models (#41605)

* Add is_causal to KosmosTextAttention
* Move get target_dtype to be imported elsewhere
* Fix fp32 flash attention bug in bark
* Fix is_causal in mllama
* Fix fp32 issue on StableLM
* Fix repo-consistency
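The fp32 items in this commit share one underlying pattern: flash attention kernels only run in fp16/bf16, so hidden states that arrive in fp32 (for example after a LayerNorm deliberately kept in fp32) have to be cast to a target dtype before the kernel call. The sketch below illustrates that dtype selection under stated assumptions; `get_flash_attention_target_dtype` is a hypothetical name, not the helper referenced in the commit, and the real transformers code may differ in signature and fallback order.

```python
import torch
import torch.nn as nn


def get_flash_attention_target_dtype(hidden: torch.Tensor, module: nn.Module) -> torch.dtype:
    """Pick a dtype to run flash attention in when the inputs are fp32.

    Hypothetical helper sketching the downcast pattern; not the library's actual API.
    """
    if hidden.dtype != torch.float32:
        return hidden.dtype                      # already fp16/bf16, nothing to do
    if torch.is_autocast_enabled():
        return torch.get_autocast_gpu_dtype()    # honour autocast's chosen dtype
    # Otherwise fall back to the dtype the module's weights were loaded in.
    return next(module.parameters()).dtype


# Example: fp32 LayerNorm output feeding an attention projection stored in bf16.
q_proj = nn.Linear(64, 64, dtype=torch.bfloat16)
hidden = torch.randn(2, 16, 64)                  # fp32, e.g. after an fp32 LayerNorm
target = get_flash_attention_target_dtype(hidden, q_proj)
query = q_proj(hidden.to(target))                # projection and kernel now run in bf16
```

The is_causal items are a related cleanup: the causal-masking flag is passed through to the attention call so the fused kernel can apply the mask itself instead of relying on an explicit mask tensor.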
References
#41605 - Fix fp32_ln for various models
Author
remi-or
Parents
b9bd8c45