diffusers
[LoRA] fix `cross_attention_kwargs` problems and tighten tests
#7388
Merged

Commits
  • debugging
    sayakpaul committed 2 years ago
  • let's see the numbers
    sayakpaul committed 2 years ago
  • let's see the numbers
    sayakpaul committed 2 years ago
  • let's see the numbers
    sayakpaul committed 2 years ago
  • restrict tolerance.
    sayakpaul committed 2 years ago
  • increase inference steps.
    sayakpaul committed 2 years ago
  • shallow copy of `cross_attention_kwargs`
    sayakpaul committed 2 years ago
  • remove print
    sayakpaul committed 2 years ago
  • Merge branch 'main' into debug-lora-scale-issue
    sayakpaul committed 2 years ago
  • Merge branch 'main' into debug-lora-scale-issue
    sayakpaul committed 2 years ago
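
The "shallow copy of `cross_attention_kwargs`" commit hints at the likely fix: if a pipeline pops the LoRA `scale` key directly out of the caller-supplied `cross_attention_kwargs` dict, the caller's dict is mutated and a second call silently runs without the scale. A minimal sketch of that pattern (the `run_step` helper and its defaults are illustrative assumptions, not diffusers' actual code):

```python
def run_step(cross_attention_kwargs=None):
    """Illustrative step that consumes a LoRA scale from kwargs."""
    if cross_attention_kwargs is not None:
        # Shallow copy so popping "scale" does not mutate the caller's dict.
        cross_attention_kwargs = cross_attention_kwargs.copy()
        lora_scale = cross_attention_kwargs.pop("scale", 1.0)
    else:
        lora_scale = 1.0
    return lora_scale, cross_attention_kwargs


kwargs = {"scale": 0.5}
first_scale, _ = run_step(kwargs)
second_scale, _ = run_step(kwargs)  # "scale" is still present in kwargs
```

Without the `.copy()`, the first call would remove `"scale"` from `kwargs`, so the second call would fall back to the default of `1.0`.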