huggingface/peft
Pull Requests
Open
fix: layers_to_transform now correctly matches layer index on MoE models
#3028 opened 2026-02-07 19:56 by
Mr-Neutr0n
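PR #3028 concerns how `layers_to_transform` resolves layer indices on MoE models, where expert indices in module names can be mistaken for layer indices. A minimal, self-contained sketch of that matching logic (the pattern and function name here are illustrative assumptions, not PEFT's actual implementation):

```python
import re

def layer_matches(module_name, layers_to_transform, layer_pattern=r"\blayers\.(\d+)\."):
    # Illustrative helper: extract the transformer layer index from a
    # module name and test membership in layers_to_transform. A naive
    # search for any digit would also match expert indices such as
    # "experts.5" on MoE models, which is the failure mode this kind
    # of fix addresses.
    m = re.search(layer_pattern, module_name)
    return m is not None and int(m.group(1)) in layers_to_transform

print(layer_matches("model.layers.5.mlp.experts.3.gate_proj", [5]))  # True
print(layer_matches("model.layers.2.mlp.experts.5.gate_proj", [5]))  # False: 5 is an expert index
```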
fix: correct typo 'occurence' to 'occurrence'
#3027 opened 2026-02-07 12:57 by
thecaptain789
[TinyLoRA] initial tinylora implementation
#3024 opened 2026-02-06 17:41 by
kashif
Fix docker CPU build
#3023 opened 2026-02-05 11:00 by
githubnemo
Add tooling to bump action version
#3022 opened 2026-02-04 12:44 by
githubnemo
Update get_error_factor to catch up with the latest transformers change
#3021 opened 2026-02-04 06:01 by
jiqing-feng
Fix MoE `layers_to_transform`
#3017 opened 2026-02-01 11:48 by
MichalMraz
fix: eliminate cross-terms bug when combining adapters with different weights
#3013 opened 2026-01-30 02:14 by
wingding12
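PR #3013 describes a cross-terms bug when combining adapters with different weights. For two LoRA adapters with factors (A1, B1) and (A2, B2), a hedged NumPy illustration of where such cross-terms come from (a generic sketch of the algebra, not PEFT's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 4, 2
A1, B1 = rng.normal(size=(r, d)), rng.normal(size=(d, r))
A2, B2 = rng.normal(size=(r, d)), rng.normal(size=(d, r))
w1, w2 = 0.7, 0.3

# Correct: weight each adapter's full low-rank delta B @ A.
correct = w1 * (B1 @ A1) + w2 * (B2 @ A2)

# Buggy: weighting the A and B factors separately before multiplying
# yields w1^2*B1@A1 + w2^2*B2@A2 plus cross-terms w1*w2*(B1@A2 + B2@A1).
buggy = (w1 * B1 + w2 * B2) @ (w1 * A1 + w2 * A2)

cross_terms = buggy - (w1 ** 2) * (B1 @ A1) - (w2 ** 2) * (B2 @ A2)
print(np.allclose(cross_terms, w1 * w2 * (B1 @ A2 + B2 @ A1)))  # True
```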
Upgrade GitHub Actions for Node 24 compatibility
#3008 opened 2026-01-22 23:17 by
salmanmkc
[feature] Tiny modification to enable OFT for finetuning embedding layers
#3005 opened 2026-01-21 17:23 by
zqiu24
Fix PEFT error with disabled adapters and FSDP
#3001 opened 2026-01-19 10:48 by
Isalia20
Add get base model state dict
#3000 opened 2026-01-16 11:23 by
Isalia20
[WIP] Support transformers weight conversion
#2995 opened 2026-01-14 17:31 by
BenjaminBossan
Add AdaMSS tuner with Adaptive Subspace Allocation (ASA)
#2987 opened 2026-01-10 07:12 by
LonglongaaaGo
fix: auto-untie word embeddings on merge_and_unload when both are adapted
#2972 opened 2025-12-31 09:25 by
www-spam
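PR #2972 touches merging into tied word embeddings: when the input embedding and the LM head share one weight tensor and both carry adapters, merging into one also mutates the other, so the weights must be untied first. A small NumPy sketch of the underlying aliasing issue (illustrative, not PEFT code):

```python
import numpy as np

vocab, dim = 6, 3

# Tied weights: the LM head aliases the embedding matrix.
embed_weight = np.zeros((vocab, dim))
lm_head_weight = embed_weight

# Merging an adapter delta into the embedding in place also mutates the
# tied LM head, conflating the two deltas when both modules are adapted.
embed_weight += 0.5
print(np.allclose(lm_head_weight, 0.5))  # True: the tied module moved too

# Untying first avoids that: copy, then merge each module's own delta.
embed_weight = np.zeros((vocab, dim))
lm_head_weight = embed_weight.copy()     # no longer aliased
embed_weight += 0.5
print(np.allclose(lm_head_weight, 0.0))  # True: LM head untouched
```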
Implements EWoRA in PEFT
#2971 opened 2025-12-31 09:17 by
HarshKohli
Introduce AdaDoRA implementation combining DoRA and AdaLoRA
#2969 opened 2025-12-25 13:16 by
OrMullerHahitti
Add UniLoRA tuner to PEFT
#2968 opened 2025-12-23 16:14 by
KaiyangLi1992
Add DataLoader to MathQA benchmarking script
#2954 opened 2025-12-14 15:53 by
ParagEkbote
Integration of PVeRA
#2952 opened 2025-12-11 18:54 by
leofillioux
[FEAT] Add FeRA (Frequency-Energy Constrained Routing Adaptation) method
#2951 opened 2025-12-11 15:35 by
YinBo0927
FIX: warmup_ratio deprecated (fixes #2949)
(label: wait-transformers-v5)
#2950 opened 2025-12-11 15:10 by
shantanugupta2004
Integrate MonteCLoRA (accepted at TMLR 2025) into PEFT
#2943 opened 2025-12-09 08:07 by
victor7246
Fix: Respect `inference_mode` when setting adapters with `modules_to_save` (Issue #2928)
#2931 opened 2025-11-29 13:09 by
ada-ggf25
ENH: Tie weights for target_modules in Lora (#2864)
#2879 opened 2025-10-29 06:01 by
romitjain
WoRA integration into PEFT
#2872 opened 2025-10-26 20:45 by
sambhavnoobcoder
Integrate ABBA to LoHA
#2842 opened 2025-10-16 05:34 by
monatis
[FEAT] Add LoReTTA
#2818 opened 2025-10-08 10:24 by
mbaddar1
Add conv2d support for fourierft and other improvements
#2794 opened 2025-09-21 19:44 by
frutiemax92
[WIP] Update `LoraConfig` for KaSA implementation
#2698 opened 2025-08-02 05:41 by
iambogeumkim