diffusers
2b760996
- [refactor] apply qk norm in attention processors (#9071)
Commit
322 days ago
[refactor] apply qk norm in attention processors (#9071)
* apply qk norm in attention processors
* revert attention processor
* qk-norm in only attention proc 2.0 and fused variant
References
#9071 - [refactor] apply qk norm in attention processors
Author
a-r-r-o-w
Parents
4f0d01d3
Files (1)
src/diffusers/models/attention_processor.py
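The single file touched is attention_processor.py. As a rough illustration of what "applying QK norm inside the attention processor" means, here is a minimal, standalone PyTorch sketch, not the diffusers implementation itself: the query and key projections are normalized per head before the torch 2.0 scaled-dot-product-attention call, which is consistent with the change being scoped to AttnProcessor2_0 and its fused variant. Class and attribute names (AttentionWithQKNorm, norm_q, norm_k) are illustrative assumptions, not the library's API.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionWithQKNorm(nn.Module):
    # Hypothetical module, for illustration only; not the diffusers Attention class.
    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.to_out = nn.Linear(dim, dim)
        # QK norm: per-head normalization of queries and keys.
        self.norm_q = nn.LayerNorm(self.head_dim)
        self.norm_k = nn.LayerNorm(self.head_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, n, _ = x.shape
        # Project and split into heads: (batch, heads, seq, head_dim).
        q = self.to_q(x).view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        k = self.to_k(x).view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        v = self.to_v(x).view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        # The step the commit moves into the processor: normalize Q and K
        # right before scaled dot-product attention (PyTorch >= 2.0).
        q = self.norm_q(q)
        k = self.norm_k(k)
        out = F.scaled_dot_product_attention(q, k, v)
        out = out.transpose(1, 2).reshape(b, n, -1)
        return self.to_out(out)

# Usage (shapes are arbitrary):
# attn = AttentionWithQKNorm(dim=64, num_heads=8)
# y = attn(torch.randn(2, 16, 64))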