Fixing multiple LoRA in the same batch or ViT #1990
fixing multiple LoRA in the same batch or vit
2579b85c
removed patching by inheritance and used pytorch pre_hook instead
fd0a9ce2
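This commit switches from subclass patching to PyTorch forward pre-hooks. A minimal sketch of that idea is below (not the PR's exact code): a pre-hook registered with `with_kwargs=True` injects the per-batch adapter names as a keyword argument of the hooked layer's `forward`, so no layer class needs to be replaced. The function names and the `adapter_names` kwarg are illustrative assumptions.

```python
import functools


def _adapter_names_pre_hook(module, args, kwargs, adapter_names):
    # Inject the per-sample adapter names as a keyword argument of forward().
    kwargs["adapter_names"] = adapter_names
    return args, kwargs


def enable_mixed_batch(layers, adapter_names):
    # Register a pre-hook on each target layer; with_kwargs=True (PyTorch >= 2.0)
    # lets the hook modify the keyword arguments passed to forward().
    handles = []
    for layer in layers:
        hook = functools.partial(_adapter_names_pre_hook, adapter_names=adapter_names)
        handles.append(layer.register_forward_pre_hook(hook, with_kwargs=True))
    # Caller should remove() each handle after the mixed-batch forward pass.
    return handles
```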
added the test, mixed batch and forward arg functions
6b0290f1
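Roughly, the mixed-batch test exercises a forward call where each sample in one batch is routed through a different adapter (or the plain base model). A hypothetical usage sketch is shown below; the `adapter_names` keyword and the `"__base__"` placeholder mirror PEFT's mixed-batch API, but the exact names used in this PR may differ.

```python
import torch


def mixed_batch_forward(peft_model, input_ids):
    # One adapter name per sample in the batch; "__base__" is assumed to mean
    # "run this sample through the base model without any LoRA adapter".
    adapter_names = ["adapter_a", "__base__", "adapter_b"]
    with torch.no_grad():
        return peft_model(input_ids=input_ids, adapter_names=adapter_names)
```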
saeid93
marked this pull request as draft 1 year ago
changed the lora test layer
d143b131
added handling specific layers in mixed batch forward
a46ad627
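A rough sketch of restricting the mixed-batch handling to specific layers only: iterate over the model's submodules and pick out the LoRA-wrapped ones, so other modules never receive the extra `adapter_names` kwarg. The `LoraLayer` import path is an assumption and may vary by PEFT version; combined with the hook sketch above, one would call `enable_mixed_batch(target_lora_layers(model), adapter_names)`.

```python
from peft.tuners.lora import LoraLayer  # assumed import path


def target_lora_layers(model):
    # Yield only the LoRA-wrapped submodules; everything else is left untouched.
    for module in model.modules():
        if isinstance(module, LoraLayer):
            yield module
```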
updated documentation with modules_to_save information and caveats
d3bce93e
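For context, `modules_to_save` marks named modules (typically a classification head) to be fully fine-tuned and saved per adapter instead of being LoRA-decomposed; the documentation update covers how this interacts with mixed-batch forward and its caveats. A small illustrative config (module names are placeholders, not from this PR):

```python
from peft import LoraConfig

config = LoraConfig(
    r=8,
    target_modules=["q_proj", "v_proj"],  # example attention projections
    modules_to_save=["classifier"],       # trained and saved in full, per adapter
)
```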
added tests for module_to_save with ModelEmbConv1D and tests for type…
60384bd3
saeid93
marked this pull request as ready for review 1 year ago
clarification on docs and comments applied
683da8ba
removed extra import
e0a12b3c
added annotation backward compatibility
bed1a104