[LoRA] fix `cross_attention_kwargs` problems and tighten tests #7388
f390f8f3  debugging
64aee4a0  let's see the numbers
8ed12147  let's see the numbers
d2d47d3f  let's see the numbers
d6bfd2f3  restrict tolerance. (see the test sketch after this list)
b267bb54  increase inference steps.
03515f00  shallow copy of `cross_attention_kwargs` (see the copy sketch after this list)
f24c5025  remove print
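The `restrict tolerance.` and `increase inference steps.` commits tighten the tests: running more denoising steps makes the output slice more stable across runs, which in turn permits a stricter comparison threshold. A hypothetical sketch of that pattern; the slice values and the `atol` below are placeholders, not the PR's actual reference numbers:

```python
import numpy as np

# Placeholder values: not the PR's actual reference slice or tolerance.
expected_slice = np.array([0.5396, 0.5707, 0.4770, 0.4665, 0.5419, 0.4594])
output_slice = np.array([0.5397, 0.5706, 0.4771, 0.4664, 0.5420, 0.4593])

# With more inference steps the output stabilizes, so the test can
# afford a tighter absolute tolerance than before.
np.testing.assert_allclose(output_slice, expected_slice, atol=1e-3)
```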
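The core change is the shallow-copy commit: the LoRA `scale` entry is popped out of the caller-supplied `cross_attention_kwargs`, so without copying first, the user's dict is mutated and a second call with the same dict no longer sees the scale. A minimal sketch of the bug and the fix, with illustrative function names rather than the library's actual code:

```python
# Minimal sketch of the mutation bug and the shallow-copy fix.
# The `scale` key mirrors how diffusers passes a LoRA scale through
# `cross_attention_kwargs`; the functions themselves are illustrative.

def forward_buggy(cross_attention_kwargs):
    # pop() mutates the caller's dict: after the first call, "scale"
    # is gone and later calls silently fall back to the default 1.0.
    return cross_attention_kwargs.pop("scale", 1.0)

def forward_fixed(cross_attention_kwargs):
    if cross_attention_kwargs is None:
        return 1.0
    # Shallow copy first, so popping "scale" leaves the caller's dict intact.
    cross_attention_kwargs = cross_attention_kwargs.copy()
    return cross_attention_kwargs.pop("scale", 1.0)

kwargs = {"scale": 0.5}
print(forward_buggy(kwargs), forward_buggy(kwargs))  # 0.5 1.0  (scale lost)
kwargs = {"scale": 0.5}
print(forward_fixed(kwargs), forward_fixed(kwargs))  # 0.5 0.5  (dict preserved)
```

A shallow copy suffices here because only the top-level `scale` key is removed; the remaining entries are passed through untouched.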
yiyixuxu approved these changes on 2024-03-19
dc7cd6a0  Merge branch 'main' into debug-lora-scale-issue
7ab0785a  Merge branch 'main' into debug-lora-scale-issue
sayakpaul merged b09a2aa3 into main 2 years ago
sayakpaul deleted the debug-lora-scale-issue branch 2 years ago