Support pass kwargs to sd3 custom attention processor #9818
Matrix53 force pushed from f2ae8a13 to e3bed528 (1 year ago)
Merge branch 'main' into support-pass-kwargs-to-sd3-custom-attn-proce… (c9531bb3)
hlky commented on 2024-12-09
fix: set joint_attention_kwargs default as empty dict (9521b749)
set default joint_attention_kwargs in transformer_sd3 (7208efff)
hlky requested changes on 2024-12-09
joint_attention_kwargs or {} in attention.py (61925512)
joint_attention_kwargs or {} in transformer_sd3.py (c771d7ec)
hlky approved these changes on 2024-12-09
hlky added the close-to-merge label
Update src/diffusers/models/transformers/transformer_sd3.py (7ede8f29)
yiyixuxu merged 8eb73c87 into main (1 year ago)