FEAT Add function to convert non-LoRA PEFT adapters to LoRA #2939
[WIP] FEAT Add function to convert to LoRA
f91f54b7
BenjaminBossan
changed the title from "[WIP] FEAT Add function to convert to LoRA" to "[WIP] FEAT Add function to convert non-LoRA PEFT adapters to LoRA" 16 days ago
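The PR adds a function that converts a non-LoRA PEFT adapter (e.g. LoHa, OFT, FourierFT) into an approximately equivalent LoRA adapter. Below is a minimal sketch of how such a conversion could be used; `convert_to_lora` and the adapter path are hypothetical placeholders for illustration, not the actual API introduced by this PR.

```python
# Sketch only: `convert_to_lora` and "path/to/non-lora-adapter" are hypothetical
# placeholders, not the function or paths added by this PR.
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")
# Load an existing non-LoRA PEFT adapter checkpoint (e.g. LoHa or OFT).
model = PeftModel.from_pretrained(base, "path/to/non-lora-adapter")

# Hypothetical conversion step: approximate the adapter's weight delta with a
# low-rank decomposition so it can be saved and served as a LoRA adapter.
# lora_model = convert_to_lora(model, adapter_name="default")
# lora_model.save_pretrained("path/to/converted-lora-adapter")
```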
Add tests for conversion
eb14cfba
More testing, docs, fix small issues
ab3f38c7
fix bug with other adapter name
9469f1f9
Support more PEFT types
64f5e8b5
Use hidden_states for comparison in tests
5614fb91
Fix for target_modules being str
9bb42eaf
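One of the fixes handles configs where `target_modules` is a plain string rather than a list. In PEFT configs, a string `target_modules` is treated as a regex that must match the full module name, while a list is matched by module-name suffix. The sketch below illustrates that distinction with a hypothetical helper; it is not the code from this PR.

```python
# Sketch only, not this PR's code: module_is_targeted is a hypothetical helper that
# mirrors how PEFT interprets target_modules as either a regex string or a suffix list.
import re

def module_is_targeted(module_name: str, target_modules) -> bool:
    if isinstance(target_modules, str):
        # A string is interpreted as a regex matched against the full module name.
        return re.fullmatch(target_modules, module_name) is not None
    # Otherwise it is an iterable of module-name suffixes.
    return any(module_name == t or module_name.endswith("." + t) for t in target_modules)

print(module_is_targeted("model.layers.0.self_attn.q_proj", r".*\.(q_proj|v_proj)"))  # True
print(module_is_targeted("model.layers.0.self_attn.q_proj", ["q_proj", "v_proj"]))    # True
```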
Fixes for WaveFT, FourierFT, ShiRA
8e4fb0b6
Add common tests
f40887ed
Fix for Conv1D
a4aa2a55
Extend documentation
7846f97b
For testing conversion, ensure eval mode
bed2a846
Merge branch 'main' into feat-lora-conversion
28c34767
Correctly deal with threshold = 1.0
32553205
Deal with rank=0
c374a513
Clean up docstrings, better wording, error message
ab3c492f
Simplify rank pattern logic a bit
f110cbc2
Deal with modules_to_save and bias
f9cd7aa3
Reviewer feedback
d8945fc2
Add support for compilation
d05997a6
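Several commits concern `torch.compile`: converted models should stay compilable, and the compile check is skipped on CPU and non-Linux platforms. The sketch below shows the general shape of such a check using only standard torch APIs; the PR's actual test logic and tolerances may differ.

```python
# Sketch only: compile a small model and compare its output with the eager model.
# Not this PR's test code; the toy model and shapes are illustrative.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
model.eval()
x = torch.randn(2, 16)

with torch.no_grad():
    expected = model(x)

# torch.compile requires a supported platform (e.g. Linux with a recent PyTorch).
compiled = torch.compile(model)
with torch.no_grad():
    actual = compiled(x)

torch.testing.assert_close(actual, expected)
```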
Simplify tests a little
24005265
Add stable diffusion test with Flux2
99b45213
Can't check for torch.compile on CPU
207b3317
Redo conversion experiment with more rigour
2762309b
Make style
2614894a
Fix table formatting
b8296ed6
Skip compile test if not Linux
86e7ce9e
Table in docs shows change in accuracy, memory
d6650c08
Fix issue with memory measure in conversion script
89647cb6
Fix return type in set_peft_model_state_dict
a041c57a
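The return-type fix concerns `set_peft_model_state_dict`, the existing PEFT utility for loading adapter weights into a model. The snippet below round-trips an adapter state dict with the real `get_peft_model_state_dict`/`set_peft_model_state_dict` helpers; the toy model and `target_modules` are just an illustration, not tied to this PR.

```python
# Usage sketch with existing PEFT utilities; the model and config are illustrative only.
import torch.nn as nn
from peft import LoraConfig, get_peft_model
from peft.utils import get_peft_model_state_dict, set_peft_model_state_dict

base = nn.Sequential(nn.Linear(16, 32), nn.Linear(32, 4))
# Target the two Linear layers by their names inside the Sequential ("0" and "1").
config = LoraConfig(target_modules=["0", "1"])
model = get_peft_model(base, config)

# Round-trip the adapter weights through a state dict.
adapter_sd = get_peft_model_state_dict(model)
load_result = set_peft_model_state_dict(model, adapter_sd)
# load_result carries the missing/unexpected keys reported by load_state_dict.
print(load_result.unexpected_keys)
```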
Allow compile without kwargs
46fcba4d
Fix issue with half precision weight dtypes
3e7ef47f
Update docs
05c60911
BenjaminBossan
changed the title from "[WIP] FEAT Add function to convert non-LoRA PEFT adapters to LoRA" to "FEAT Add function to convert non-LoRA PEFT adapters to LoRA" 13 days ago
BenjaminBossan
marked this pull request as ready for review 13 days ago
sayakpaul
approved these changes
on 2025-12-12
Fix copyright notice
62d4bed6
Better document dynamic rank
be8b3029
Adjust table formatting, values are unchanged
9bb6c95b
Improve/fix comments
ce506bea