diffusers
acd6d2c4
- Fix a bug where `joint_attention_kwargs` was not passed to FLUX's transformer attention processors (#9517)
Commit
1 year ago
Fix a bug where `joint_attention_kwargs` was not passed to FLUX's transformer attention processors (#9517)

* Update transformer_flux.py
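For context, the fix amounts to threading `joint_attention_kwargs` from the model's forward pass through to the attention processor calls inside the FLUX transformer blocks in `transformer_flux.py`, instead of silently dropping them. Below is a minimal sketch of that pattern, not the actual diffusers source; `SketchAttention` and `SketchTransformerBlock` are hypothetical stand-ins.

```python
# Minimal sketch (assumed names, not the real diffusers classes) of the kind
# of change this commit describes: forwarding `joint_attention_kwargs` into
# the attention call so the processor actually receives them.

from typing import Any, Dict, Optional


class SketchAttention:
    """Stand-in for an attention module whose processor accepts extra kwargs."""

    def __call__(self, hidden_states, **processor_kwargs):
        # A real processor (e.g. a LoRA or IP-Adapter variant) would consume
        # entries such as `scale` from processor_kwargs here.
        print(f"attention processor received kwargs: {processor_kwargs}")
        return hidden_states


class SketchTransformerBlock:
    def __init__(self):
        self.attn = SketchAttention()

    def forward(
        self,
        hidden_states,
        joint_attention_kwargs: Optional[Dict[str, Any]] = None,
    ):
        joint_attention_kwargs = joint_attention_kwargs or {}
        # Before the fix: `self.attn(hidden_states)` ignored the kwargs.
        # After the fix: unpack them into the attention call.
        return self.attn(hidden_states, **joint_attention_kwargs)


block = SketchTransformerBlock()
block.forward(hidden_states=[0.0], joint_attention_kwargs={"scale": 0.5})
```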
References
#9517 - Fix a bug where `joint_attention_kwargs` was not passed to FLUX's transformer attention processors
Author
HorizonWind2004
Parents
86bd991e