diffusers
[DreamBooth] add text encoder LoRA support in the DreamBooth training script #3130 (Merged)

sayakpaul merged 10 commits into main from dreambooth/text-enc

sayakpaul commented 2 years ago (edited)

Following up on #2918. Closes #3065

Hyperparameter tuning might change the game but with the default hyperparameters (taken from the LoRA training section of DreamBooth), it seems to be doing okay.

Report: https://wandb.ai/sayakpaul/dreambooth-lora/reports/test-23-04-17-17-00-13---Vmlldzo0MDkwNjMy

Model repo (comes with a model card): https://huggingface.co/sayakpaul/dreambooth

Sharing this cutie which was generated:
[image: generated sample]
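
For context, here is a minimal sketch of the LoRA-attachment pattern the script already uses for the UNet; the text encoder support added in this PR mirrors it via AttnProcsLayers. This is an illustrative reconstruction for a diffusers version of this era, not the exact PR diff:

```python
# Illustrative reconstruction (not the exact PR diff) of how
# train_dreambooth_lora.py attaches LoRA attention processors to the UNet.
from diffusers import UNet2DConditionModel
from diffusers.loaders import AttnProcsLayers
from diffusers.models.attention_processor import LoRAAttnProcessor

unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet"
)

lora_attn_procs = {}
for name in unet.attn_processors.keys():
    # Self-attention (attn1) has no cross-attention conditioning.
    cross_attention_dim = (
        None if name.endswith("attn1.processor") else unet.config.cross_attention_dim
    )
    if name.startswith("mid_block"):
        hidden_size = unet.config.block_out_channels[-1]
    elif name.startswith("up_blocks"):
        block_id = int(name[len("up_blocks.")])
        hidden_size = list(reversed(unet.config.block_out_channels))[block_id]
    elif name.startswith("down_blocks"):
        block_id = int(name[len("down_blocks.")])
        hidden_size = unet.config.block_out_channels[block_id]
    lora_attn_procs[name] = LoRAAttnProcessor(
        hidden_size=hidden_size, cross_attention_dim=cross_attention_dim
    )

unet.set_attn_processor(lora_attn_procs)
# Only these wrapped LoRA parameters are optimized; the base UNet is frozen.
unet_lora_layers = AttnProcsLayers(unet.attn_processors)
```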

sayakpaul add: LoRA text encoder support for DreamBooth example.
43db9146
sayakpaul fix initialization.
7aaf8f88
sayakpaul fix: modification call.
d30bf8c4
sayakpaul add: entry in the readme.
e68bef9e
sayakpaul requested a review from patrickvonplaten 2 years ago
HuggingFaceDocBuilderDev commented 2 years ago (edited)

The documentation is not available anymore as the PR was closed or merged.

patrickvonplaten commented 2 years ago

Great! Could we maybe also update the docs:
https://github.com/huggingface/diffusers/blob/main/docs/source/en/training/lora.mdx
and tests:
https://github.com/huggingface/diffusers/blob/main/examples/test_examples.py (I think we've never added a LoRA test - my bad, since I added the DreamBooth LoRA training script 😅)

themrzmaster commented on 2023-04-18
examples/dreambooth/train_dreambooth_lora.py

```python
params_to_clip = (
    itertools.chain(unet_lora_layers.parameters(), text_encoder_lora_layers.parameters())
    if args.train_text_encoder
    else text_encoder_lora_layers.parameters()
)
```
themrzmaster 2 years ago (edited)

Shouldn't it be `else unet_lora_layers.parameters()`?

sayakpaul 2 years ago

Yes. Will change.
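
For clarity, the corrected clipping logic (applied in the `fix: params to clip` commit below) reads:

```python
import itertools

# Variables (args, unet_lora_layers, text_encoder_lora_layers) come from the
# training script; only the UNet LoRA parameters are clipped when the text
# encoder is not being trained.
params_to_clip = (
    itertools.chain(unet_lora_layers.parameters(), text_encoder_lora_layers.parameters())
    if args.train_text_encoder
    else unet_lora_layers.parameters()
)
```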

sayakpaul use dog dataset from hub.
9419667f
sayakpaul fix: params to clip.
9ec9f994
sayakpaul add entry to the LoRA doc.
83cbd20b
sayakpaul add: tests for lora.
8ad2d68e
sayakpaul commented 2 years ago

@patrickvonplaten ready for a review now. I have addressed your comments.

The tests were run using the command below:

```
python -m pytest -n 2 --max-worker-restart=0 --dist=loadfile -s -v -k "lora" examples/test_examples.py
```
innat-asj commented on 2023-04-19
docs/source/en/training/lora.mdx
(added lines)

> It's also possible to additionally fine-tune the text encoder with LoRA. This, in most cases, leads to better results with a slight increase in the compute. To allow fine-tuning the text encoder with LoRA,
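
The same AttnProcsLayers pattern carries over to the text encoder. A hedged sketch of how the new `--train_text_encoder` path might build its LoRA layers (an illustrative reconstruction, not the exact PR code; the `self_attn` suffix match follows CLIP's module naming):

```python
# Illustrative reconstruction: LoRA layers for the text encoder's
# self-attention modules, mirroring the UNet pattern shown earlier.
from transformers import CLIPTextModel
from diffusers.loaders import AttnProcsLayers
from diffusers.models.attention_processor import LoRAAttnProcessor

text_encoder = CLIPTextModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="text_encoder"
)

text_lora_attn_procs = {}
for name, module in text_encoder.named_modules():
    # CLIP attention blocks are named "...layers.N.self_attn".
    if name.endswith("self_attn"):
        text_lora_attn_procs[name] = LoRAAttnProcessor(
            hidden_size=module.out_proj.out_features, cross_attention_dim=None
        )

# Only these parameters are optimized; the base text encoder stays frozen.
text_encoder_lora_layers = AttnProcsLayers(text_lora_attn_procs)
```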
innat-asj 2 years ago

Would it be possible to add more experimental details of unet_lora and text_encoder_lora (and combined) in separate blogs (like dreambooth)?

sayakpaul 2 years ago

We are leaving it to the community for now. But we will update if we find more :)

patrickvonplaten approved these changes on 2023-04-20

patrickvonplaten 2 years ago

Great job! Good to merge for me - we'll just need to resolve the merge conflicts

sayakpaul merge conflicts.
3c9aa69f
sayakpaul remove unnecessary list comprehension.
1c5c5f19
sayakpaul merged 3045fb27 into main 2 years ago
sayakpaul deleted the dreambooth/text-enc branch 2 years ago
tcapelle commented 2 years ago

I think you broke something... My LoRA + DreamBooth was working fine, and now I am unable to load the LoRA weights back:

```python
pipeline.unet.load_attn_procs(lora_model_path)
```

```
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
Cell In[4], line 1
----> 1 pipeline.unet.load_attn_procs(lora_model_path)
      2 pipeline.to("cuda")

File ~/Apps/diffusers/src/diffusers/loaders.py:279, in UNet2DConditionLoadersMixin.load_attn_procs(self, pretrained_model_name_or_path_or_dict, **kwargs)
    276 attn_processors = {k: v.to(device=self.device, dtype=self.dtype) for k, v in attn_processors.items()}
    278 # set layers
--> 279 self.set_attn_processor(attn_processors)

File ~/Apps/diffusers/src/diffusers/models/unet_2d_condition.py:513, in UNet2DConditionModel.set_attn_processor(self, processor)
    510         fn_recursive_attn_processor(f"{name}.{sub_name}", child, processor)
    512 for name, module in self.named_children():
--> 513     fn_recursive_attn_processor(name, module, processor)

File ~/Apps/diffusers/src/diffusers/models/unet_2d_condition.py:510, in UNet2DConditionModel.set_attn_processor.<locals>.fn_recursive_attn_processor(name, module, processor)
    507         module.set_processor(processor.pop(f"{name}.processor"))
    509 for sub_name, child in module.named_children():
--> 510     fn_recursive_attn_processor(f"{name}.{sub_name}", child, processor)

File ~/Apps/diffusers/src/diffusers/models/unet_2d_condition.py:510, in UNet2DConditionModel.set_attn_processor.<locals>.fn_recursive_attn_processor(name, module, processor)
    507         module.set_processor(processor.pop(f"{name}.processor"))
    509 for sub_name, child in module.named_children():
--> 510     fn_recursive_attn_processor(f"{name}.{sub_name}", child, processor)

    [... skipping similar frames: UNet2DConditionModel.set_attn_processor.<locals>.fn_recursive_attn_processor at line 510 (3 times)]

File ~/Apps/diffusers/src/diffusers/models/unet_2d_condition.py:510, in UNet2DConditionModel.set_attn_processor.<locals>.fn_recursive_attn_processor(name, module, processor)
    507         module.set_processor(processor.pop(f"{name}.processor"))
    509 for sub_name, child in module.named_children():
--> 510     fn_recursive_attn_processor(f"{name}.{sub_name}", child, processor)

File ~/Apps/diffusers/src/diffusers/models/unet_2d_condition.py:507, in UNet2DConditionModel.set_attn_processor.<locals>.fn_recursive_attn_processor(name, module, processor)
    505         module.set_processor(processor)
    506     else:
--> 507         module.set_processor(processor.pop(f"{name}.processor"))
    509 for sub_name, child in module.named_children():
    510     fn_recursive_attn_processor(f"{name}.{sub_name}", child, processor)

KeyError: 'down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor'
```
tcapelle 2 years ago

Reverting to previous commits fixes the issue.

sayakpaul 2 years ago

It's better if you create a new issue thread for the problems you are facing, with a reproducible snippet.

tcapelle 2 years ago

Yeah, sorry. I just pulled and integrated that commit, it broke, and I realised that you had just merged it.
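
For readers hitting the same KeyError: the updated script saves the UNet and text encoder LoRA layers under prefixed keys, which the UNet-only `load_attn_procs` loader does not expect. A hedged sketch of the pipeline-level loading path, assuming a diffusers version that provides `load_lora_weights` (the checkpoint path is hypothetical):

```python
import torch
from diffusers import StableDiffusionPipeline

pipeline = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
# load_lora_weights restores both the UNet and text encoder LoRA layers,
# unlike unet.load_attn_procs, which only handles UNet attention processors.
pipeline.load_lora_weights("path/to/lora_model")  # hypothetical path
pipeline.to("cuda")
image = pipeline("a photo of sks dog in a bucket").images[0]
```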
