[Core] LoRA improvements pt. 3 #4842
throw a warning when fusing more than one LoRA is attempted.
91b34abb
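The idea behind this commit can be sketched as follows. This is a minimal toy illustration, not the actual diffusers implementation: the class and method names here stand in for the real fusion plumbing.

```python
import warnings

class LoRALayer:
    """Toy stand-in for a layer that can fold LoRA weights into itself."""

    def __init__(self):
        self._fused = False

    def fuse_lora(self):
        # Warn (rather than raise) if a LoRA is already fused into the
        # base weights: fusing a second one on top would compound the
        # deltas and likely give unexpected results.
        if self._fused:
            warnings.warn(
                "A LoRA is already fused into this layer; fusing another "
                "on top of it may give unexpected results."
            )
        self._fused = True

layer = LoRALayer()
layer.fuse_lora()  # first fusion: silent
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    layer.fuse_lora()  # second fusion attempt: warns
print(len(caught))  # → 1
```

A warning (instead of a hard error) keeps intentional double-fusion possible while still flagging the common mistake.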
introduce support for LoRA scale during fusion.
c9eeb788
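Scale-aware fusion means baking the LoRA scale into the merged weight once, instead of applying it on every forward pass. A minimal sketch of the arithmetic (NumPy here for clarity; the real code operates on `torch` module weights):

```python
import numpy as np

def fuse_lora(W, A, B, scale=1.0):
    # Fold the low-rank update into the base weight once, scaled:
    #   W_fused = W + scale * B @ A
    # After fusion, a plain matmul with W_fused already includes the
    # scaled LoRA contribution.
    return W + scale * (B @ A)

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
A = rng.standard_normal((2, 4))   # (rank, in_features)
B = rng.standard_normal((4, 2))   # (out_features, rank)

# With scale=0 the LoRA contributes nothing; the base weight is unchanged.
print(np.allclose(fuse_lora(W, A, B, scale=0.0), W))  # → True
```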
change test name
37692b12
changes
cfd19a57
change to _lora_scale
8a9dad00
pass lora_scale to calls wherever applicable.
ed3b37ad
debugging
b86a8f6d
additional lora_scale handling.
80839d63
cross_attention_kwargs
2ed1f2a4
lora_scale -> scale.
3967da86
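These commits plumb the LoRA scale through `cross_attention_kwargs` and rename the key from `lora_scale` to `scale`. A minimal sketch of the retrieval side, assuming a simplified forward signature (the function body here is a placeholder, not the real attention math):

```python
def attention_forward(hidden, cross_attention_kwargs=None):
    # Retrieve the LoRA scale from cross_attention_kwargs, defaulting
    # to 1.0 when the caller did not pass one. The key is "scale",
    # per the lora_scale -> scale rename.
    kwargs = cross_attention_kwargs or {}
    scale = kwargs.get("scale", 1.0)
    # Placeholder for the actual attention computation; only the
    # scale plumbing is of interest here.
    return hidden * scale

print(attention_forward(2.0))                                         # → 2.0
print(attention_forward(2.0, cross_attention_kwargs={"scale": 0.5}))  # → 1.0
```

Defaulting to `1.0` keeps behavior unchanged for callers that never pass a scale.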
lora_scale fix
e24fd70a
lora_scale in patched projection.
21e765b2
debugging
9678ed29
debugging
acbbb4d8
debugging
0c2dad46
debugging
62694120
debugging
cc0c7ec0
debugging
b357ffcf
debugging
8495b434
debugging
96fc1afb
debugging
0c501d3d
debugging
016d3e95
debugging
910e96b4
debugging
de159dad
debugging
4ee8dbfe
styling.
cd9ac470
debugging
1cd983fb
debugging
860a374d
debugging
1b2346ca
debugging
583da5f0
debugging
77f64591
debugging
ec67361c
debugging
6c9c5dc5
debugging
d7b35d46
debugging
e601d2b9
debugging
35148d07
debugging
55efe9cd
debugging
0d7b3df0
remove unneeded prints.
cdc79631
remove unneeded prints.
2a3e358b
assign cross_attention_kwargs.
42c2c0ae
debugging
98e6eca9
debugging
03abb4c0
debugging
32a175fb
debugging
ef1ad841
debugging
9a759b96
debugging
369a53f0
debugging
833fd358
debugging
d8371ab1
debugging
a5925ab6
debugging
d3d6ab11
debugging
8c0b5846
debugging
43d6c8d6
debugging
caa86251
debugging
38cbe461
debugging
b29e025a
debugging
b2759470
debugging
d8b4bf71
debugging
a3df6cd8
debugging
265d5f4c
Merge branch 'main' into lora-improvements-pt3
00167bed
clean up.
7d348840
sayakpaul
marked this pull request as ready for review 2 years ago
refactor scale retrieval logic a bit.
9dee7d4f
fix NoneType
f81f77d5
fix: tests
92e1194f
add more tests
4511f48e
sayakpaul
changed the title from [WIP] [Core] LoRA improvements pt. 3 to [Core] LoRA improvements pt. 3 2 years ago
more fixes.
6667e686
figure out a way to pass lora_scale.
b941b88d
Apply suggestions from code review
9705cc28
unify the retrieval logic of lora_scale.
bebab129
move adjust_lora_scale_text_encoder to lora.py.
81f7ddf9
introduce dynamic LoRA scale adjustment support for SD
e2c835c4
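The `adjust_lora_scale_text_encoder` helper mentioned above lets the scale be changed at call time without re-loading weights. A minimal sketch of the pattern, with a hypothetical stand-in class (the real helper walks the text encoder's patched projection modules):

```python
class PatchedLoraProjection:
    """Toy stand-in for a LoRA-patched projection with a runtime scale."""

    def __init__(self):
        self.lora_scale = 1.0

def adjust_lora_scale(modules, lora_scale):
    # Walk all patched projections and update their scale in place, so
    # each pipeline call can use a different effective LoRA strength.
    for m in modules:
        if isinstance(m, PatchedLoraProjection):
            m.lora_scale = lora_scale

projs = [PatchedLoraProjection(), PatchedLoraProjection()]
adjust_lora_scale(projs, 0.3)
print([p.lora_scale for p in projs])  # → [0.3, 0.3]
```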
Merge branch 'main' into lora-improvements-pt3
ca48db69
fix up copies
f2026acb
Empty-Commit
74448966
add: test to check fusion equivalence on different scales.
e60f4509
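The equivalence this test checks: for any scale, the fused weight must produce the same output as the unfused base-plus-scaled-LoRA branch. A self-contained NumPy sketch of that property (the real test exercises actual pipeline modules):

```python
import numpy as np

def lora_forward(W, A, B, x, scale):
    # Unfused path: base output plus the scaled low-rank branch.
    return W @ x + scale * (B @ (A @ x))

def fused_forward(W, A, B, x, scale):
    # Fused path: scale baked into the merged weight.
    return (W + scale * (B @ A)) @ x

rng = np.random.default_rng(42)
W = rng.standard_normal((8, 8))
A = rng.standard_normal((2, 8))
B = rng.standard_normal((8, 2))
x = rng.standard_normal(8)

# The two paths must agree for every scale, not just scale=1.0.
for scale in (0.0, 0.5, 1.0):
    assert np.allclose(lora_forward(W, A, B, x, scale),
                       fused_forward(W, A, B, x, scale))
print("ok")  # → ok
```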
handle lora fusion warning.
bf1052b7
make lora smaller
47333846
make lora smaller
dabdd58c
make lora smaller
51824c74
Merge branch 'main' into lora-improvements-pt3
972c8e86
Assignees: No one assigned