[FEAT] New LoRA Initialization Method: Explained Variance Adaptation #2142
ec630939 initial commit
dc26e7e5 add target modules based on eva_state_dict
70171c8a remove default args
f537b8ed cleanup
a3ae4bae set correct prefix for state dict loading
dd5a38b1 revert
2cfdf0c9 update docstrings, minor changes
f3f89c6b update docstrings, integrate peft changes
b71b4222 fix issues related to config having either type PeftConfig or dict
571eca8b move state_dict modification to function
bec1f80d update paper link
fc343675 remove comments
0e8d29ce add documentation
c41e8204 add docstrings
eff0c9da update
1c912726 fix docs and add default EvaConfig
57799713 add eva init to peft namespace
b5968d29 add test for lowrank argument
a397418f add check
57b72b47 simplify arguments
073f0a1e update docstrings
6d0e0d7d optimize indices calc
sirluk force-pushed from 2779bf34 to 6d0e0d7d 334 days ago
3574109f Update src/peft/tuners/lora/eva.py
91e87e6f add warning
a133e8a1 update
523f1cda add check if all hooks have been removed
261c81e6 extend documentation
566ebf04 make style
e940efab add tests for eva
ba82bd51 add licence notice
202f933a add tests for lora_config with eva
c6a5fc52 update
413be29c fix tau range
5ad5f00d Merge branch 'main' into main
b1bcf02f update tau tests
e8908394 add validity checks to initialize_lora_eva_weights
ebb0ac62 style
81fbc284 extend documentation and small fixes
62ae35c2 improve customization options
0b316d36 Merge pull request #3 from sirluk/simplify_entrypoints
7c753b6b update documentation
404022f0 update docs
724fd1c0 error to warning
d79672b1 make style
ad646da1 fix type
51874665 fix potential issues
b6b7b7bc add option to adjust alpha after redistribution
13ffc56b Merge pull request #4 from sirluk/alpha_pattern
17f5bf14 Update src/peft/tuners/lora/eva.py
1f03a954 fix edge cases
f74b0447 Merge pull request #5 from sirluk/alpha_pattern
12d497f9 Merge branch 'main' into main
e6099695 account for layer pattern
7e4505f4 split up print
bec670fb fix rank_budget in case of rank_pattern
b13d0144 Merge branch 'huggingface:main' into main
b24500c6 small fixes
3119dc42 missing return statement
sirluk changed the title from "New LoRA Initialization Method: Explained Variance Adaptation" to "[FEAT] New LoRA Initialization Method: Explained Variance Adaptation" 320 days ago
ec408974 Merge branch 'main' into main
05f99ba6 move dataloader none check
c628afe5 adjust default value
69896629 update test threshold
5982139b update docs
b5c6b8fe remove speed test and update docs
2c6fb37e fix typo
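For context on the feature this commit history builds up: Explained Variance Adaptation (EVA) initializes LoRA weights data-driven, from the SVD of layer input activations, and uses explained-variance ratios to redistribute ranks across layers. The snippet below is only a conceptual sketch of that idea on a random activation matrix, not the PR's implementation; `eva_style_init` is a hypothetical helper name.

```python
import numpy as np

def eva_style_init(x, rank):
    """Conceptual sketch (hypothetical helper, not the PEFT API):
    derive a LoRA "A" initialization from the top right singular
    vectors of a batch of layer inputs x with shape (n_samples, d_in).
    Returns (A, explained_variance_ratio_of_the_kept_components)."""
    x = x - x.mean(axis=0)                      # center the activations
    _, s, vt = np.linalg.svd(x, full_matrices=False)
    a = vt[:rank]                               # (rank, d_in): LoRA A init
    evr = (s ** 2) / np.sum(s ** 2)             # variance explained per component
    return a, evr[:rank]

rng = np.random.default_rng(0)
x = rng.normal(size=(256, 64))                  # fake minibatch of activations
a, evr = eva_style_init(x, rank=8)
print(a.shape)                                  # (8, 64)
print(bool(evr.sum() <= 1.0))                   # True: top-8 explain at most 100%
```

In the actual PR, the analogous statistics are gathered incrementally over a dataloader via hooks (see the "add check if all hooks have been removed" and "move dataloader none check" commits), and the per-layer explained variance drives rank and alpha redistribution ("add option to adjust alpha after redistribution").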
Assignees: No one assigned