diffusers
b09a2aa3 - [LoRA] fix `cross_attention_kwargs` problems and tighten tests (#7388)

Commit · 1 year ago

Squashed commits:

* debugging
* let's see the numbers
* restrict tolerance
* increase inference steps
* shallow copy of `cross_attention_kwargs`
* remove print
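The "shallow copy of `cross_attention_kwargs`" item likely addresses a common mutation pitfall: if a pipeline pops the LoRA `scale` key directly out of the caller's `cross_attention_kwargs` dict, the caller's dict is mutated and a second call no longer sees the key. A minimal sketch of the pattern (function names here are illustrative, not the actual diffusers code):

```python
# Illustrative sketch of the shallow-copy fix; `run_without_copy` /
# `run_with_copy` are hypothetical helpers, not diffusers APIs.

def run_without_copy(cross_attention_kwargs):
    # Mutates the caller's dict: after this call, `scale` is gone.
    scale = cross_attention_kwargs.pop("scale", 1.0)
    return scale, cross_attention_kwargs

def run_with_copy(cross_attention_kwargs):
    # Shallow-copy first, so the caller's dict is left untouched.
    kwargs = dict(cross_attention_kwargs)
    scale = kwargs.pop("scale", 1.0)
    return scale, kwargs

caller_kwargs = {"scale": 0.5}
run_without_copy(caller_kwargs)
print("scale" in caller_kwargs)  # → False: the caller's dict was mutated

caller_kwargs = {"scale": 0.5}
run_with_copy(caller_kwargs)
print("scale" in caller_kwargs)  # → True: the caller's dict is preserved
```

A shallow copy suffices here because only the top-level keys are removed; the values themselves (e.g. tensors) are still shared, so no extra memory is duplicated.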