diffusers
[LoRA] feat: add lora attention processor for pt 2.0. #3594
Merged

sayakpaul merged 22 commits into main from feat/lora-attn-pt2
sayakpaul feat: add lora attention processor for pt 2.0.
bfcd0ad1
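
This first commit introduces the processor itself. At its core is the standard LoRA building block: a frozen linear projection plus a trainable low-rank update. A minimal sketch of that block, assuming the usual diffusers naming (`LoRALinearLayer` with `down`/`up` projections); the merged code may differ in detail:

```python
import torch
import torch.nn as nn


class LoRALinearLayer(nn.Module):
    """Trainable low-rank update applied alongside a frozen linear layer.

    A hedged sketch of the LoRA building block, not the exact merged code.
    """

    def __init__(self, in_features: int, out_features: int, rank: int = 4):
        super().__init__()
        self.down = nn.Linear(in_features, rank, bias=False)
        self.up = nn.Linear(rank, out_features, bias=False)
        # Zero-init the up projection so the LoRA branch starts as a no-op.
        nn.init.normal_(self.down.weight, std=1 / rank)
        nn.init.zeros_(self.up.weight)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        return self.up(self.down(hidden_states))
```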
sayakpaul requested a review from williamberman 2 years ago
sayakpaul requested a review from patrickvonplaten 2 years ago
sayakpaul explicit context manager for SDPA.
53c51997
sayakpaul switch to flash attention
00df0a41
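
The "explicit context manager for SDPA" and "switch to flash attention" commits point at PyTorch 2.0's kernel-selection API: `torch.backends.cuda.sdp_kernel` restricts which scaled-dot-product-attention backend may run. A hedged sketch of forcing the flash kernel (tensor shapes are illustrative; this context manager is the PyTorch 2.0 API and was later superseded by `torch.nn.attention.sdpa_kernel`):

```python
import torch
import torch.nn.functional as F

# (batch, heads, seq_len, head_dim); half precision on CUDA is required
# for the flash kernel to be eligible.
q = torch.randn(2, 8, 64, 40, device="cuda", dtype=torch.float16)
k = torch.randn(2, 8, 64, 40, device="cuda", dtype=torch.float16)
v = torch.randn(2, 8, 64, 40, device="cuda", dtype=torch.float16)

# Allow only the flash-attention backend; math and memory-efficient
# attention are disabled, so SDPA raises if flash cannot run here.
with torch.backends.cuda.sdp_kernel(
    enable_flash=True, enable_math=False, enable_mem_efficient=False
):
    out = F.scaled_dot_product_attention(q, k, v)
```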
sayakpaul make shapes compatible to work optimally with SDPA.
71b8ad27
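
"make shapes compatible to work optimally with SDPA" refers to tensor layout: `F.scaled_dot_product_attention` expects `(batch, heads, seq_len, head_dim)`, whereas the pre-2.0 attention path flattened heads into the batch dimension for `torch.baddbmm`. A sketch of the reshape, with illustrative dimensions:

```python
import torch
import torch.nn.functional as F

batch, seq_len, inner_dim, heads = 2, 64, 320, 8
head_dim = inner_dim // heads
hidden = torch.randn(batch, seq_len, inner_dim)

# (batch, seq_len, inner_dim) -> (batch, heads, seq_len, head_dim)
def to_sdpa_layout(t: torch.Tensor) -> torch.Tensor:
    return t.view(batch, seq_len, heads, head_dim).transpose(1, 2)

q, k, v = (to_sdpa_layout(hidden) for _ in range(3))
out = F.scaled_dot_product_attention(q, k, v)

# Undo the head split before the output projection.
out = out.transpose(1, 2).reshape(batch, seq_len, inner_dim)
```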
sayakpaul resolve conflicts.
4142edb7
sayakpaul fix: circular import problem.
bf60598c
sayakpaul explicitly specify the flash attention kernel in sdpa
e5fad840
sayakpaul fall back to efficient attention context manager.
15190752
sayakpaul remove explicit dispatch.
a193d265
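
The explicit-kernel experiments above (force flash, then fall back to the efficient-attention context manager) were ultimately dropped: "remove explicit dispatch" means the processor simply calls `F.scaled_dot_product_attention` and lets PyTorch pick the best backend for the device and dtype. A sketch of the final call (shapes illustrative):

```python
import torch
import torch.nn.functional as F

q = torch.randn(2, 8, 64, 40)
k = torch.randn(2, 8, 64, 40)
v = torch.randn(2, 8, 64, 40)

# No sdp_kernel context manager: PyTorch dispatches to flash,
# memory-efficient, or math attention depending on what is available.
out = F.scaled_dot_product_attention(
    q, k, v, attn_mask=None, dropout_p=0.0, is_causal=False
)
```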
patrickvonplaten commented on 2023-06-01
patrickvonplaten commented on 2023-06-01
sayakpaul fix: removed processor.
7898c112
sayakpaul fix: remove optional from type annotation.
68674274
sayakpaul feat: make changes regarding LoRAAttnProcessor2_0.
4d3afd2d
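
"make changes regarding LoRAAttnProcessor2_0" lands the processor under that name. A simplified sketch of its shape, reusing the `LoRALinearLayer` block sketched earlier; the merged version also handles attention masks and group norm, and its exact signature may differ:

```python
import torch.nn as nn
import torch.nn.functional as F


class LoRAAttnProcessor2_0(nn.Module):
    """LoRA attention processor backed by torch 2.0 SDPA.

    A hedged sketch: `LoRALinearLayer` is the block sketched above, and
    `attn` is the diffusers Attention module being processed.
    """

    def __init__(self, hidden_size, cross_attention_dim=None, rank=4):
        super().__init__()
        if not hasattr(F, "scaled_dot_product_attention"):
            raise ImportError("LoRAAttnProcessor2_0 requires PyTorch 2.0.")
        kv_dim = cross_attention_dim or hidden_size
        self.to_q_lora = LoRALinearLayer(hidden_size, hidden_size, rank)
        self.to_k_lora = LoRALinearLayer(kv_dim, hidden_size, rank)
        self.to_v_lora = LoRALinearLayer(kv_dim, hidden_size, rank)
        self.to_out_lora = LoRALinearLayer(hidden_size, hidden_size, rank)

    def __call__(self, attn, hidden_states, encoder_hidden_states=None, scale=1.0):
        context = hidden_states if encoder_hidden_states is None else encoder_hidden_states

        # Frozen base projection plus the scaled low-rank LoRA update.
        query = attn.to_q(hidden_states) + scale * self.to_q_lora(hidden_states)
        key = attn.to_k(context) + scale * self.to_k_lora(context)
        value = attn.to_v(context) + scale * self.to_v_lora(context)

        batch, seq_len, inner_dim = query.shape
        head_dim = inner_dim // attn.heads
        # Split heads into the (batch, heads, seq, head_dim) layout SDPA expects.
        query, key, value = (
            t.view(t.shape[0], -1, attn.heads, head_dim).transpose(1, 2)
            for t in (query, key, value)
        )

        hidden_states = F.scaled_dot_product_attention(
            query, key, value, dropout_p=0.0, is_causal=False
        )
        hidden_states = hidden_states.transpose(1, 2).reshape(batch, seq_len, inner_dim)

        # Output projection (attn.to_out[0]) with its own LoRA correction.
        return attn.to_out[0](hidden_states) + scale * self.to_out_lora(hidden_states)
```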
sayakpaul remove confusing warning.
b694e3f9
sayakpaul formatting.
ffb136d8
sayakpaul relax tolerance for PT 2.0
8c304bca
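
"relax tolerance for PT 2.0": the SDPA kernels are not bit-identical to the pre-2.0 math attention path, so equivalence tests loosen the comparison threshold. Roughly like the following (the values and the exact atol are illustrative, not taken from the PR):

```python
import torch

# Outputs from the 2.0 SDPA path vs. the reference attention path can
# differ in the low decimal places, so compare with a looser tolerance.
expected = torch.tensor([0.5421, 0.3021])  # illustrative values
actual = torch.tensor([0.5423, 0.3019])
assert torch.allclose(actual, expected, atol=1e-3)
```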
sayakpaul fix: loading message.
9d12c34a
sayakpaul remove unnecessary logging.
3c3c2f7b
sayakpaul marked this pull request as ready for review 2 years ago
sayakpaul add: entry to the docs.
ba3f7ad2
sayakpaul requested a review from patrickvonplaten 2 years ago
sayakpaul commented on 2023-06-02
sayakpaul commented on 2023-06-02
sayakpaul commented on 2023-06-02
patrickvonplaten commented on 2023-06-02
patrickvonplaten approved these changes on 2023-06-02
sayakpaul merge main and resolve conflicts.
5017e922
williamberman approved these changes on 2023-06-05
sayakpaul Merge branch 'main' into feat/lora-attn-pt2
06e90167
sayakpaul add: network_alpha argument.
0c764515
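
"add: network_alpha argument" wires in the alpha/rank scaling convention used by kohya-style LoRA checkpoints. A sketch extending the earlier `LoRALinearLayer` (again hedged; the merged code may differ in detail):

```python
import torch.nn as nn


class LoRALinearLayer(nn.Module):
    def __init__(self, in_features, out_features, rank=4, network_alpha=None):
        super().__init__()
        self.down = nn.Linear(in_features, rank, bias=False)
        self.up = nn.Linear(rank, out_features, bias=False)
        # kohya-style checkpoints store an alpha value; the effective update
        # is scaled by alpha / rank so behavior is stable across ranks.
        self.network_alpha = network_alpha
        self.rank = rank

    def forward(self, hidden_states):
        out = self.up(self.down(hidden_states))
        if self.network_alpha is not None:
            out = out * (self.network_alpha / self.rank)
        return out
```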
sayakpaul relax tolerance.
b13c5df9
sayakpaul merged 8669e831 into main 2 years ago
sayakpaul deleted the feat/lora-attn-pt2 branch 2 years ago
