peft
fixing multiple LoRA in the same batch or vit
#1990
Merged


saeid93 fixing multiple LoRA in the same batch or vit
2579b85c
BenjaminBossan commented on 2024-08-06
saeid93 removed patching by inheritance and used pytorch pre_hook instead
fd0a9ce2
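The commit above replaces inheritance-based patching with PyTorch forward pre-hooks. The sketch below is a hypothetical illustration of that general technique (a hook with with_kwargs=True injecting per-sample adapter names), not a copy of PEFT's internals; the class and adapter names are made up.

```python
import torch
from torch import nn

class ToyLoraLinear(nn.Module):
    """Hypothetical stand-in for a LoRA-wrapped linear layer whose forward
    accepts per-sample adapter names."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        self.lora = nn.Linear(in_features, out_features, bias=False)

    def forward(self, x, adapter_names=None):
        out = self.base(x)
        if adapter_names is not None:
            # Apply the LoRA delta only to rows that request the adapter;
            # "__base__" rows are left untouched.
            mask = torch.tensor(
                [name != "__base__" for name in adapter_names], device=x.device
            )
            out = out + self.lora(x) * mask.unsqueeze(-1).to(x.dtype)
        return out

def inject_adapter_names(adapter_names):
    # A forward pre-hook (with_kwargs=True) adds the routing information to the
    # call, so the layer does not need to be patched by subclassing.
    def hook(module, args, kwargs):
        kwargs["adapter_names"] = adapter_names
        return args, kwargs
    return hook

layer = ToyLoraLinear(8, 8)
handle = layer.register_forward_pre_hook(
    inject_adapter_names(["adapter_a", "__base__"]), with_kwargs=True
)
y = layer(torch.randn(2, 8))  # row 0 uses the adapter, row 1 the base layer only
handle.remove()
```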
BenjaminBossan requested changes on 2024-08-07
saeid93 added the test, mixed batch and forward arg functions
6b0290f1
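For context on the mixed-batch forward argument being tested here: PEFT exposes mixed adapter batches by passing adapter_names to the forward/generate call, with "__base__" selecting the base model for a given sample. The checkpoint paths and adapter names below are placeholders.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Placeholder model and adapter paths; replace with your own checkpoints.
base = AutoModelForCausalLM.from_pretrained("base-model")
tokenizer = AutoTokenizer.from_pretrained("base-model")
model = PeftModel.from_pretrained(base, "path/to/adapter_a", adapter_name="adapter_a")
model.load_adapter("path/to/adapter_b", adapter_name="adapter_b")

inputs = tokenizer(
    ["sample 1", "sample 2", "sample 3"], return_tensors="pt", padding=True
)
# One adapter name per sample; "__base__" means "use the base model only".
adapter_names = ["adapter_a", "__base__", "adapter_b"]
output = model.generate(**inputs, adapter_names=adapter_names, max_new_tokens=20)
```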
BenjaminBossan requested changes on 2024-08-26
saeid93 marked this pull request as draft 1 year ago
saeid93 changed the lora test layer
d143b131
saeid93 added handling specific layers in mixed batch forward
a46ad627
saeid93 updated documentation with modules_to_save information and caveats
d3bce93e
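modules_to_save is the LoraConfig field that keeps selected base-model modules (for example a freshly initialized classification head) fully trainable alongside the LoRA weights; the documentation update in this commit covers how such modules interact with mixed adapter batches. A minimal, hypothetical ViT configuration might look like the following; the module names depend on the model architecture.

```python
from transformers import AutoModelForImageClassification
from peft import LoraConfig, get_peft_model

model = AutoModelForImageClassification.from_pretrained(
    "google/vit-base-patch16-224",
    num_labels=10,
    ignore_mismatched_sizes=True,  # replace the 1000-class head with a new one
)
config = LoraConfig(
    r=8,
    target_modules=["query", "value"],  # ViT attention projections
    modules_to_save=["classifier"],     # keep the new head fully trainable
)
peft_model = get_peft_model(model, config)
```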
saeid93 added tests for module_to_save with ModelEmbConv1D and tests for type…
60384bd3
saeid93 requested a review from BenjaminBossan 1 year ago
saeid93 marked this pull request as ready for review 1 year ago
BenjaminBossan requested changes on 2024-09-16
saeid93 clarification on docs and comments applied
683da8ba
saeid93 removed extra import
e0a12b3c
saeid93 added annotation backward compatibility
bed1a104
BenjaminBossan approved these changes on 2024-09-17
BenjaminBossan merged adf0a1dc into main 1 year ago
