ENH: Tie weights for target_modules in Lora (#2864) #2879
Tests and initial implementation for embed_tokens
4c6d15f1
romitjain
changed the title from "Tie weights for target_modules in Lora" to "Tie weights for target_modules in Lora (#2864)" 57 days ago
Minor fixes
4b912208
Fixed all tests and made updates to logic
46b803e4
romitjain
marked this pull request as ready for review 54 days ago
romitjain
changed the title from "Tie weights for target_modules in Lora (#2864)" to "ENH: Tie weights for target_modules in Lora (#2864)" 54 days ago
Nit
37b1e064
Added contiguous check for export
8388aa86
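The contiguity check mentioned in the commit above might look like the following minimal sketch. The helper name `ensure_contiguous` is hypothetical (not from the PR); the underlying point is real: serialization formats such as safetensors require contiguous tensors, and views produced by transposes are not contiguous.

```python
import torch


def ensure_contiguous(state_dict: dict) -> dict:
    """Return a state dict whose tensors are all contiguous.

    Export formats like safetensors reject non-contiguous tensors,
    so views (e.g. from .t()) must be materialized first.
    """
    return {
        key: tensor if tensor.is_contiguous() else tensor.contiguous()
        for key, tensor in state_dict.items()
    }


# A transpose produces a non-contiguous view of the underlying storage.
weight = torch.randn(4, 2).t()
print(weight.is_contiguous())  # False

fixed = ensure_contiguous({"weight": weight})
print(fixed["weight"].is_contiguous())  # True
```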
Apply suggestion from @BenjaminBossan
cd6c6d0f
Addressed PR comments
0cb44e8b
Update src/peft/tuners/lora/model.py
628ce10f
Update src/peft/tuners/lora/model.py
602ce107
Apply suggestions from code review
e2d0345f
Removed redundant change
78800327
Merge branch 'enh/tie-target-modules' of github.com:romitjain/peft in…
f73af50f
Handling target_modules as str
46cca1e0
Update src/peft/tuners/tuners_utils.py
2267a480
Updated regex matching
5d5b8e43
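The two commits above ("Handling target_modules as str" and "Updated regex matching") concern how PEFT decides whether a module is targeted: a string `target_modules` is treated as a regex matched against the full module name, while a list is matched by suffix. A simplified sketch of that convention (the helper `match_target_modules` is illustrative, not the PR's actual function):

```python
import re


def match_target_modules(name: str, target_modules) -> bool:
    """Check whether a module name is selected by target_modules.

    A str is treated as a regex and must fully match the module name;
    a list/set matches when the name equals an entry or ends with
    "." + entry (suffix matching).
    """
    if isinstance(target_modules, str):
        return re.fullmatch(target_modules, name) is not None
    return any(name == t or name.endswith("." + t) for t in target_modules)


print(match_target_modules("model.layers.0.q_proj", ".*q_proj"))  # True
print(match_target_modules("model.layers.0.q_proj", ["q_proj"]))  # True
print(match_target_modules("model.embed_tokens", ["q_proj"]))     # False
```

Note that `re.fullmatch` anchors the pattern at both ends, so a bare `"q_proj"` string would not match `"model.layers.0.q_proj"` unless written as a full-name regex such as `".*q_proj"`.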
Apply suggestion from @BenjaminBossan
c7cfe406
Added find layer by tensor
8294ec73
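"Added find layer by tensor" suggests locating a module by the identity of its weight tensor, which is how tied weights (e.g. `embed_tokens` sharing its weight with `lm_head`) can be detected. A minimal sketch of the idea, with a hypothetical helper name and a toy model:

```python
from typing import Optional

import torch
import torch.nn as nn


def find_module_by_tensor(model: nn.Module, tensor: torch.Tensor) -> Optional[str]:
    """Return the name of the first module whose weight IS the given tensor.

    Tied modules share one tensor object, so an identity check (`is`)
    finds every module that points at it.
    """
    for name, module in model.named_modules():
        if getattr(module, "weight", None) is tensor:
            return name
    return None


class TinyLM(nn.Module):
    """Toy model where lm_head is weight-tied to embed_tokens."""

    def __init__(self):
        super().__init__()
        self.embed_tokens = nn.Embedding(10, 4)
        self.lm_head = nn.Linear(4, 10, bias=False)
        self.lm_head.weight = self.embed_tokens.weight  # tie the weights


model = TinyLM()
print(find_module_by_tensor(model, model.lm_head.weight))  # embed_tokens
```

Because the search walks `named_modules()` in registration order, passing the tied tensor returns the first owner (`embed_tokens` here) even when queried via `lm_head.weight`.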
Merge branch 'main' of github.com:romitjain/peft into enh/tie-target-…
7370a21e
Fixed tests
1da895f0
Nit
d86ff7d0
Small fix to ensure correct layer name gets saved for target modules
dc03dd4a
Merge branch 'main' of github.com:huggingface/peft into enh/tie-targe…
c79a64c7
Merge branch 'main' of github.com:huggingface/peft into enh/tie-targe…
0715451b
Apply suggestions from code review
dbb00960
Merge branch 'enh/tie-target-modules' of github.com:romitjain/peft in…
06d4b7fe
romitjain force pushed from cd2e82c8 to 1cd9137b 8 days ago
Updated matching logic
67a71d63
romitjain force pushed from 1cd9137b to 67a71d63 8 days ago