diffusers
84bc0e48 - [LoRA] fix `cross_attention_kwargs` problems and tighten tests (#7388)

Commit · 1 year ago
[LoRA] fix `cross_attention_kwargs` problems and tighten tests (#7388)

* debugging
* let's see the numbers
* let's see the numbers
* let's see the numbers
* restrict tolerance
* increase inference steps
* shallow copy of `cross_attention_kwargs`
* remove print
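The "shallow copy of `cross_attention_kwargs`" item addresses a classic Python pitfall: popping a key (such as the LoRA `scale`) from a dict argument mutates the caller's dict, so a later call with the same dict silently loses the value. A minimal sketch of the problem and the copy-first fix (illustrative only; the function names are hypothetical, not the actual diffusers code):

```python
def forward_mutating(cross_attention_kwargs):
    # BUG: pops "scale" from the caller's dict in place, so a second
    # call reusing the same dict no longer sees the LoRA scale.
    return cross_attention_kwargs.pop("scale", 1.0)

def forward_fixed(cross_attention_kwargs):
    # FIX: shallow-copy first; the caller's dict stays untouched.
    kwargs = cross_attention_kwargs.copy()
    return kwargs.pop("scale", 1.0)

shared = {"scale": 0.5}
forward_mutating(shared)
print("scale" in shared)  # → False: caller's dict was mutated

shared = {"scale": 0.5}
forward_fixed(shared)
print("scale" in shared)  # → True: dict preserved across calls
```

A shallow copy is enough here because only the top-level keys are removed; the values themselves are never modified.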