peft
LoRA-GA Integration #2926
Open

sambhavnoobcoder Add LoRA-GA core implementation
b9c368d6
sambhavnoobcoder Register and export LoRA-GA across PEFT
8128a564
sambhavnoobcoder Add comprehensive test suite for LoRA-GA
c5743ee5
sambhavnoobcoder Add LoRA-GA documentation
ec5490ca
sambhavnoobcoder Add LoRA-GA example script and README
5074bc89
sambhavnoobcoder Add paper reference to lora_ga_utils module docstring
96f0ec2b
BenjaminBossan commented on 2025-11-28
sambhavnoobcoder Refactor LoraGAConfig to sub-config pattern
0aefeb68
sambhavnoobcoder Add preprocess_loraga function for gradient estimation
6e369829
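The commit above introduces a preprocessing step that estimates per-module gradients before the adapter is created. As a rough illustration of the idea only (this is not the `preprocess_loraga` signature; `estimate_grads`, `batches`, and `loss_fn` are hypothetical names), averaging weight gradients over a few calibration batches might look like:

```python
import torch
import torch.nn as nn

# Hypothetical sketch of gradient estimation: run a few batches and average
# the gradients of 2-D (linear) weights. Not the PEFT implementation.
def estimate_grads(model, batches, loss_fn):
    est = {n: torch.zeros_like(p) for n, p in model.named_parameters()
           if p.requires_grad and p.ndim == 2}
    for x, y in batches:
        loss_fn(model(x), y).backward()
        for n, p in model.named_parameters():
            if n in est and p.grad is not None:
                est[n] += p.grad.detach()
        model.zero_grad(set_to_none=True)
    return {n: g / len(batches) for n, g in est.items()}

model = nn.Sequential(nn.Linear(8, 4), nn.ReLU(), nn.Linear(4, 2))
batches = [(torch.randn(16, 8), torch.randn(16, 2)) for _ in range(3)]
grads = estimate_grads(model, batches, nn.MSELoss())
```

The averaged gradients then feed the SVD-based initialization added in the commits below.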
sambhavnoobcoder Refactor lora_ga_init with SVD fix and fallback handling
63611513
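For context on what `lora_ga_init` computes: LoRA-GA builds the adapter factors from the SVD of the estimated weight gradient. The sketch below is an assumption-laden illustration, not the PEFT code; the exact slices, scaling, and the fallback path for SVD failures that this commit mentions follow the paper and the actual implementation.

```python
import torch

# Illustrative LoRA-GA-style init: factor the estimated gradient G with SVD
# and take leading singular directions for the adapter factors. The slice
# choice below (B from the first r left vectors, A from the next r right
# vectors) is one possibility; the real code may differ.
def svd_init(grad, r):
    U, S, Vh = torch.linalg.svd(grad.float(), full_matrices=False)
    B = U[:, :r]          # (out_features, r)
    A = Vh[r:2 * r, :]    # (r, in_features)
    return A, B

G = torch.randn(32, 64)   # stand-in for an estimated gradient
A, B = svd_init(G, r=8)
```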
sambhavnoobcoder Update LoRA exports for new LoRA-GA API
a9d52846
sambhavnoobcoder Add lora_ga to save_mutated_as_lora pattern
0b10a86a
sambhavnoobcoder Remove old LoRA-GA utilities from exports
fd831f77
sambhavnoobcoder Update LoRA-GA test suite for new API
db5a0414
sambhavnoobcoder Update LoRA-GA example and documentation
6cb8bed1
sambhavnoobcoder Remove LoraGAModel from peft package exports
d5f4bb74
sambhavnoobcoder Remove LoraGAModel from tuners package exports
840509f4
sambhavnoobcoder Remove LoraGAModel class from lora model
d65fa574
sambhavnoobcoder Delete old LoRA-GA utilities file
91a80b97
sambhavnoobcoder Remove LORAGA from PeftType enum
fa1905b9
sambhavnoobcoder Export preprocess_loraga from peft package
31f798ac
sambhavnoobcoder Add preprocess_loraga to tuners __all__ list
2f1c2096
sambhavnoobcoder requested a review from BenjaminBossan 22 days ago
BenjaminBossan requested changes on 2025-12-04
sambhavnoobcoder Remove cache_file from LoraGAConfig dataclass
78baafd4
sambhavnoobcoder Refactor LoRA-GA preprocessing: add cache_file parameter, use _peft_ …
01014c1b
sambhavnoobcoder Fix attribute deletion to use _peft_ prefixed names in layer.py
8b9b6b0f
sambhavnoobcoder Update tests to use _peft_loraga_grad attribute name
5f70ef42
sambhavnoobcoder Update examples: move data_iter inside train_step, add script descrip…
ff9329dd
sambhavnoobcoder Update docs: change copyright to 2025, update usage tips, remove unve…
2eeea7c6
sambhavnoobcoder Add LoRA-GA to warning about rslora with rank_pattern when modifying …
81d6985d
sambhavnoobcoder Add documentation for path_initial_model_for_weight_conversion usage …
747befa9
sambhavnoobcoder Pass lora_ga_config as parameter instead of attaching to modules
0156a34b
sambhavnoobcoder requested a review from BenjaminBossan 16 days ago
sambhavnoobcoder Merge upstream/main into loraga-integration branch
19d4e0fe
BenjaminBossan requested changes on 2025-12-08
sambhavnoobcoder Add Conv1D support and improve gradient estimation efficiency in LoRA-GA
a306f1b2
sambhavnoobcoder Update documentation to clarify LoRA-GA does not support quantized mo…
2d5022e6
sambhavnoobcoder Remove unnecessary note from README
7abaacae
sambhavnoobcoder Improve error message for missing lora_ga_config
e9f101c5
sambhavnoobcoder Remove redundant test_gradient_shapes test
75b0bfeb
sambhavnoobcoder Remove unnecessary TestLoraGAConfig class
f438cf5f
sambhavnoobcoder Remove TestLoraGAInitialization class with low-value tests
5b739f80
sambhavnoobcoder Move os import to root level
62128459
sambhavnoobcoder Use pytest tmp_path fixture instead of tempfile
8697f43e
sambhavnoobcoder Convert get_model_and_train_step to pytest fixtures
c58bde35
sambhavnoobcoder Enhance save/load tests with parametrization and fix random direction…
35164820
sambhavnoobcoder Add random seed for test reproducibility
c9690c25
sambhavnoobcoder Add test for cached gradients
49f0e104
sambhavnoobcoder Update copyright year to 2025
c6ccdda9
sambhavnoobcoder Raise error when lora_ga_config is missing with init_lora_weights='lo…
a7e22b59
sambhavnoobcoder Rename target_modules to get_target_modules
4fecab7e
sambhavnoobcoder Add lora_ga_config param to LoraParallelLinear and run style formatting
4b12fba5
sambhavnoobcoder requested a review from BenjaminBossan 14 days ago
BenjaminBossan requested changes on 2025-12-11
sambhavnoobcoder Simplify test fixtures
2173498e
sambhavnoobcoder Add tests for lower precision dtypes and quantized model rejection
800d5c8c
sambhavnoobcoder Add support for mixed models with unsupported layer types in LoRA-GA
5f6135a8
sambhavnoobcoder requested a review from BenjaminBossan 13 days ago
BenjaminBossan requested changes on 2025-12-12
sambhavnoobcoder Remove non-existent LoraGAModel from documentation
476bed7f
sambhavnoobcoder Make target_modules a CLI argument in example script
8d919ee3
sambhavnoobcoder Auto-populate target_modules from defaults in preprocess_loraga
641b2043
sambhavnoobcoder Use eval_strategy instead of evaluation_strategy
37e57f2b
sambhavnoobcoder Make gradient computation memory-efficient by disabling gradients for…
d4db2b60
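One way to read the memory-efficiency commit above: during gradient estimation only the target weights need gradients, so everything else can be frozen to avoid allocating gradient buffers for non-target parameters. A minimal sketch of that idea (`freeze_non_targets` and the prefix convention are assumptions, not PEFT API):

```python
import torch.nn as nn

# Sketch: disable grads for every parameter whose name does not start with a
# target prefix, so backward only materializes gradients where needed.
def freeze_non_targets(model, target_prefixes):
    for name, p in model.named_parameters():
        p.requires_grad_(any(name.startswith(t) for t in target_prefixes))

model = nn.Sequential(nn.Linear(8, 8), nn.Linear(8, 2))
freeze_non_targets(model, target_prefixes=["1."])
```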
sambhavnoobcoder Use forward hooks to correctly count gradient computations across mul…
dcc9a2d6
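The forward-hook fix above concerns modules that run more than once per step (e.g. shared or tied layers): their accumulated gradients should be normalized by the number of contributions that actually occurred, not by the number of optimizer steps. A hedged sketch of the counting part:

```python
import torch
import torch.nn as nn

# Sketch (illustrative only): a forward hook counts invocations per module,
# so accumulated gradients can later be averaged per actual use.
counts = {}

def counting_hook(module, args, output):
    counts[module] = counts.get(module, 0) + 1

shared = nn.Linear(4, 4)
handle = shared.register_forward_hook(counting_hook)
x = torch.randn(2, 4)
y = shared(shared(x))  # the same module runs twice in one forward pass
handle.remove()
```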
sambhavnoobcoder Separate test parametrizations for direction and scale with explicit …
bc402b90
sambhavnoobcoder Add helper function for BitsAndBytes quantized LoRA-GA training
5c164249
BenjaminBossan requested changes on 2025-12-18
sambhavnoobcoder Move imports to top of file in LoRA-GA example
2304c1c6
sambhavnoobcoder Simplify gradient estimation using backward hooks with accumulation
84521a2a
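On the backward-hook approach the commit above moves to: a `Tensor.register_hook` on each target weight fires during backward with that weight's gradient, which can be added into a side buffer instead of relying on `.grad`. The following is an illustrative sketch of that pattern, not the PEFT code; `attach_accumulators` is a hypothetical name.

```python
import torch
import torch.nn as nn

# Sketch of backward-hook accumulation: buffer each Linear weight's gradient
# across several estimation batches, then average.
def attach_accumulators(model):
    buffers, handles = {}, []
    for name, module in model.named_modules():
        if isinstance(module, nn.Linear):
            buffers[name] = torch.zeros_like(module.weight)
            def make_hook(key):
                def hook(grad):
                    buffers[key] += grad.detach()
                return hook
            handles.append(module.weight.register_hook(make_hook(name)))
    return buffers, handles

model = nn.Sequential(nn.Linear(8, 8), nn.Linear(8, 2))
buffers, handles = attach_accumulators(model)
for _ in range(4):  # a few estimation batches
    model(torch.randn(16, 8)).pow(2).mean().backward()
    model.zero_grad(set_to_none=True)
for h in handles:
    h.remove()
avg = {k: v / 4 for k, v in buffers.items()}
```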
sambhavnoobcoder Simplify gradient accumulation following PyTorch best practices
7570aa24
sambhavnoobcoder Remove helper function and document quantization workflow in README
aee93563
sambhavnoobcoder Lower default learning rate to 3e-5 for LoRA-GA
3ba1fb22
sambhavnoobcoder requested a review from BenjaminBossan 5 days ago
BenjaminBossan commented on 2025-12-19
sambhavnoobcoder Fix dtype precision bug in LoRA-GA weight modification
14d59839
sambhavnoobcoder requested a review from BenjaminBossan 2 days ago
