peft
WoRA integration into PEFT #2872
Status: Open
Commits by sambhavnoobcoder:

ae503aae  Add WoRA core implementation with weighted direction and learnable al…
e42daab5  Update LoraLayer to support WoRA with alpha/beta parameter storage
dca53dd4  Update Linear, Embedding, and Conv layer classes to support WoRA vari…
e8733fb5  Add use_wora parameter propagation in LoraModel layer creation
a11c3384  Fix parameter ordering in update_layer methods for Python syntax comp…
cc8fbc07  Fix WoRA parameter registration and config validation for complete gr…
633a8500  Enable gradient flow through WoRA alpha and beta parameters
209752e4  Set requires_grad=True for WoRA alpha and beta parameters
7d9725b0  Move wora_alpha and wora_beta to adapter_layer_names for trainability
01b18601  Rename wora_alpha/wora_beta to lora_wora_alpha/lora_wora_beta for pre…
9f2ec488  Add WoRA tests to test_lora_variants.py
a0dc6e0c  Add WoRA tests and fix gradient flow issues
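The commits above describe a LoRA variant with learnable alpha/beta mixing weights whose gradients must flow while the base weights stay frozen. As a rough illustration only, here is a minimal standalone sketch of that idea; the class name `WoRALinear` and the exact mixing formula (alpha scaling the frozen base path, beta scaling the low-rank update) are assumptions, not the PR's actual implementation, which lives inside PEFT's `LoraLayer` machinery.

```python
import torch
import torch.nn as nn


class WoRALinear(nn.Module):
    """Hypothetical sketch of a weighted-LoRA linear layer.

    Assumed formula: y = alpha * base(x) + beta * B(A(x)),
    with alpha/beta learnable scalars (names mirror the commits'
    wora_alpha / wora_beta parameters).
    """

    def __init__(self, base: nn.Linear, r: int = 8):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze base weights, as in LoRA

        self.lora_A = nn.Linear(base.in_features, r, bias=False)
        self.lora_B = nn.Linear(r, base.out_features, bias=False)
        nn.init.zeros_(self.lora_B.weight)  # update starts at zero

        # Learnable mixing weights; requires_grad=True by default,
        # matching the "Enable gradient flow" commits above.
        self.wora_alpha = nn.Parameter(torch.ones(1))
        self.wora_beta = nn.Parameter(torch.ones(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return (self.wora_alpha * self.base(x)
                + self.wora_beta * self.lora_B(self.lora_A(x)))
```

Because `lora_B` is zero-initialized and `wora_alpha` starts at 1, the layer initially reproduces the base linear layer exactly, while a backward pass still produces gradients for `wora_alpha` and `wora_beta` but not for the frozen base weight.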
Reviewers: no reviews
Assignees: no one assigned