diffusers
acd6d2c4 - Fix a bug where `joint_attention_kwargs` was not passed to FLUX's transformer attention processors (#9517)

* Update transformer_flux.py
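
For context, a fix of this kind amounts to forwarding the `joint_attention_kwargs` dict into the attention call inside the transformer blocks, so custom attention processors actually receive it. A minimal sketch of that pattern is below; `FluxBlockSketch` is a hypothetical simplified stand-in, not the actual diffusers `transformer_flux.py` code.

```python
from typing import Any, Dict, Optional

import torch
import torch.nn as nn


class FluxBlockSketch(nn.Module):
    """Hypothetical, simplified stand-in for a FLUX transformer block."""

    def __init__(self, attn: nn.Module):
        super().__init__()
        self.attn = attn

    def forward(
        self,
        hidden_states: torch.Tensor,
        encoder_hidden_states: torch.Tensor,
        joint_attention_kwargs: Optional[Dict[str, Any]] = None,
    ) -> torch.Tensor:
        # The bug class being fixed: the block accepts the kwargs dict but
        # never forwards it, so attention processors silently miss it.
        joint_attention_kwargs = joint_attention_kwargs or {}
        # The fix: unpack the dict into the attention processor call, so any
        # processor-specific arguments it carries are actually delivered.
        return self.attn(
            hidden_states=hidden_states,
            encoder_hidden_states=encoder_hidden_states,
            **joint_attention_kwargs,
        )
```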