[Training] Add `datasets` version of LCM LoRA SDXL #5778
add: script to train lcm lora for sdxl with 🤗 datasets
ca7f220b
suit up the args.
88efd154
remove comments.
9e49fd2e
fix num_update_steps
728aa8a6
fix batch unmarshalling
bc8cfddf
fix num_update_steps_per_epoch
8c4d4b6a
fix: dataloading.
6d2f7407
fix microconditions.
c7f28284
unconditional predictions debug
df707545
fix batch size.
dd93227e
no need to use use_auth_token
3d4b1da0
Apply suggestions from code review
79672471
make vae encoding batch size an arg
6b2e42f2
final serialization in kohya
d7f632e6
style
e4edb31b
Merge branch 'main' into lcm-lora-sdxl-datasets
858009b4
state dict rejigging
6aa2dd8c
feat: no separate teacher unet.
1fd33782
debug
41354149
fix state dict serialization
3b066d26
debug
fc5546fe
debug
ba0d0f25
debug
35e30fbb
remove prints.
53c13f7f
remove kohya utility and make style
cff23edb
fix serialization
ca076c78
fix
808f61ea
add test
842df25c
add peft dependency.
00276736
add: peft
c6255532
remove peft
c5317ff3
autocast device determination from accelerator
6a690abd
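The commit above derives the autocast device type from the accelerator's device instead of hard-coding "cuda" (in the script this would come from `accelerator.device.type`). A minimal stdlib-only sketch of that derivation; the helper name is an assumption:

```python
def autocast_device_type(device: str) -> str:
    # An accelerator device string such as "cuda:0" maps to the
    # torch.autocast device_type "cuda"; "cpu" and "mps" pass through.
    return device.split(":", 1)[0]

print(autocast_device_type("cuda:0"))  # cuda
```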
autocast
8c4eaf67
reduce lora rank.
cece7819
remove unneeded space
beb8aa2c
Apply suggestions from code review
33cb9d03
style
795cc9f9
remove prompt dropout.
042f3578
also save in native diffusers ckpt format.
283af651
debug
5e099a24
debug
71db43a2
debug
e1346d56
better formation of the null embeddings.
dfcf2340
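One plausible reading of the commit above: encode the empty prompt once and reuse it across the batch, rather than re-encoding "" per sample. A toy sketch with an arbitrary `encode_fn`; all names here are assumptions, not the script's actual API:

```python
def build_null_embeddings(encode_fn, batch_size):
    # Encode the empty prompt exactly once, then tile the result
    # across the batch for the unconditional branch.
    null_embedding = encode_fn("")
    return [null_embedding] * batch_size

embeds = build_null_embeddings(lambda prompt: [len(prompt)], 4)
```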
remove space.
5ce6cc19
autocast fixes.
7ee9d5d9
autocast fix.
1b359ae8
hacky
82b628a3
remove lora_sayak
17d5c0dd
Apply suggestions from code review
fea95e0f
style
83801a69
make log validation leaner.
0c5d9348
Merge branch 'main' into lcm-lora-sdxl-datasets
3b034bea
move `enabled` back in.
0f42185e
fix: log_validation call.
41f19258
add: checkpointing tests
bf5c5d6b
Merge branch 'main' into lcm-lora-sdxl-datasets
64063c7a
Merge branch 'main' into lcm-lora-sdxl-datasets
53cf0e76
Merge branch 'main' into lcm-lora-sdxl-datasets
de958dc6
Merge branch 'main' into lcm-lora-sdxl-datasets
5824fa3b
Merge branch 'main' into lcm-lora-sdxl-datasets
f52cb6e7
taking my chances to see if disabling autocasting has any effect?
5534b0c2
resolve conflicts
3bacd82d
start debugging
1da3071f
name
bd4d1c43
name
26f16c18
name
91740272
more debug
92ba868d
more debug
1fba251b
index
3751ca9b
remove index.
63649d35
print length
05de5422
print length
5e604a8a
print length
8fecdda2
move unet.train() after add_adapter()
023866f8
disable some prints.
07c28de8
enable_adapters() manually.
c6a61dac
remove prints.
ec33085e
Merge branch 'main' into lcm-lora-sdxl-datasets
d14dd411
some changes.
ed7969d2
fix params_to_optimize
8c549e4b
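`params_to_optimize` should contain only the weights left trainable after freezing the base UNet, i.e. the LoRA adapter parameters. A stdlib-only sketch of that filter; `SimpleNamespace` stands in for `torch.nn.Parameter`, and the names are assumptions:

```python
from types import SimpleNamespace

def params_to_optimize(named_parameters):
    # Keep only parameters still requiring gradients after the base
    # model was frozen: these are the LoRA adapter weights.
    return [p for _, p in named_parameters if p.requires_grad]

named = [
    ("unet.base.weight", SimpleNamespace(requires_grad=False)),
    ("unet.lora_A.weight", SimpleNamespace(requires_grad=True)),
]
trainable = params_to_optimize(named)  # only the LoRA weight survives
```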
more fixes
94460666
debug
0153665f
debug
b9891ffb
remove print
b11b0a6d
disable grad for certain contexts.
539bda39
Merge branch 'main' into lcm-lora-sdxl-datasets
dfe916dd
patil-suraj marked this pull request as ready for review 2 years ago
Add support for IPAdapterFull (#5911)
d5a40cde
Fix a bug in `add_noise` function (#6085)
e3d76c47
[Advanced Diffusion Script] Add Widget default text (#6100)
472c3974
[Advanced Training Script] Fix pipe example (#6106)
373d3923
IP-Adapter for StableDiffusionControlNetImg2ImgPipeline (#5901)
be46b6eb
IP adapter support for most pipelines (#5900)
c7a87ca7
resolve conflicts
556b7977
Merge branch 'main' into lcm-lora-sdxl-datasets
a8d97858
fix: lora_alpha
47abcf6b
make vae casting conditional.
b7c0f95f
param upcasting
7a1d6c90
propagate comments from https://github.com/huggingface/diffusers/pull…
87f87a70
sayakpaul changed the title from [WIP][Training] Add `datasets` version of LCM LoRA SDXL to [Training] Add `datasets` version of LCM LoRA SDXL 2 years ago
dg845 commented on 2023-12-21
Merge branch 'main' into lcm-lora-sdxl-datasets
404351fa
[Peft] fix saving / loading when unet is not "unet" (#6046)
4c7e983b
[Wuerstchen] fix fp16 training and correct lora args (#6245)
0bb9cf02
[docs] fix: animatediff docs (#6339)
11659a6f
add: note about the new script in readme_sdxl.
f645b87e
Revert "[Peft] fix saving / loading when unet is not "unet" (#6046)"
fd64acf9
Revert "[Wuerstchen] fix fp16 training and correct lora args (#6245)"
121567b0
Revert "[docs] fix: animatediff docs (#6339)"
c24626ae
remove tokenize_prompt().
4c689b29
assistive comments around enable_adapters() and disable_adapters().
1b49fb92
Merge branch 'main' into lcm-lora-sdxl-datasets
9b3dbaaf
sayakpaul merged 6683f979 into main 2 years ago
sayakpaul deleted the lcm-lora-sdxl-datasets branch 2 years ago