diffusers
[LoRA] fix `cross_attention_kwargs` problems and tighten tests
#7388
Merged

sayakpaul merged 10 commits into main from debug-lora-scale-issue
sayakpaul debugging
f390f8f3
sayakpaul let's see the numbers
64aee4a0
sayakpaul let's see the numbers
8ed12147
sayakpaul let's see the numbers
d2d47d3f
sayakpaul restrict tolerance.
d6bfd2f3
sayakpaul increase inference steps.
b267bb54
sayakpaul shallow copy of cross_attention_kwargs (see the sketches after the commit list)
03515f00
sayakpaul remove print
f24c5025
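
The commit trail above suggests the core bug: entries such as the LoRA `scale` were popped out of the user-supplied `cross_attention_kwargs` dict in place, mutating the caller's dict so the scale only survived the first call. The "shallow copy of cross_attention_kwargs" commit points at the fix. A minimal sketch of the pattern, with `apply_lora_scale` as an illustrative name rather than the actual diffusers internals:

```python
def apply_lora_scale(cross_attention_kwargs=None):
    # Shallow-copy before popping so the caller's dict is never mutated.
    # Without the copy, "scale" disappears from the caller's dict after
    # the first call and later calls silently fall back to the default.
    if cross_attention_kwargs is not None:
        cross_attention_kwargs = cross_attention_kwargs.copy()
        lora_scale = cross_attention_kwargs.pop("scale", 1.0)
    else:
        lora_scale = 1.0
    return lora_scale, cross_attention_kwargs

kwargs = {"scale": 0.5}
assert apply_lora_scale(kwargs)[0] == 0.5
assert apply_lora_scale(kwargs)[0] == 0.5   # still 0.5 on the second call
assert kwargs == {"scale": 0.5}             # caller's dict left intact
```

Dropping the `.copy()` makes the second assertion fail: the first call's `pop` removes `"scale"` from the caller's dict, so every subsequent call falls back to the default, which is the cross-step drift the `debug-lora-scale-issue` branch name hints at.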
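
The "restrict tolerance." and "increase inference steps." commits describe the test-tightening side: with more denoising steps, runs at different LoRA scales drift further apart, so the comparison can use a stricter threshold without flaking. A hedged sketch of that style of check, where the checkpoint path, LoRA path, prompt, and tolerance are all placeholders rather than the actual test fixtures:

```python
import numpy as np
import torch
from diffusers import StableDiffusionPipeline

# Placeholder checkpoint and LoRA paths; any small SD pipeline with a
# loadable LoRA exercises the same code path.
pipe = StableDiffusionPipeline.from_pretrained("path/to/tiny-sd-checkpoint")
pipe.load_lora_weights("path/to/lora")

def run(scale, steps=30):
    # Fixed seed so the only difference between runs is the LoRA scale.
    generator = torch.manual_seed(0)
    return pipe(
        "a photo of a dog",
        num_inference_steps=steps,  # more steps -> a larger, stabler gap
        generator=generator,
        output_type="np",
        cross_attention_kwargs={"scale": scale},
    ).images[0]

# With enough steps, scale=0.0 vs scale=1.0 should differ by more than a
# tight tolerance; repeated calls must keep seeing the same scale, which
# is exactly what the shallow-copy fix guarantees.
assert not np.allclose(run(0.0), run(1.0), atol=1e-3)
```

Note that `cross_attention_kwargs={"scale": ...}` is passed fresh on each call here; the fixed bug only manifests when the same dict object is reused across calls, as a pipeline does internally across denoising steps.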
sayakpaul requested a review from BenjaminBossan 2 years ago
sayakpaul requested a review from yiyixuxu 2 years ago
yiyixuxu approved these changes on 2024-03-19
younesbelkada approved these changes on 2024-03-19
sayakpaul Merge branch 'main' into debug-lora-scale-issue
dc7cd6a0
BenjaminBossan approved these changes on 2024-03-19
sayakpaul Merge branch 'main' into debug-lora-scale-issue
7ab0785a
sayakpaul merged b09a2aa3 into main 2 years ago
sayakpaul deleted the debug-lora-scale-issue branch 2 years ago
