huggingface/peft
Open Pull Requests
miss update
#3122 opened 2026-03-30 08:48 by Joluck
FIX Cache position is None with transformers v5.4
#3120 opened 2026-03-27 16:42 by BenjaminBossan
DOC Update contribution guidelines
#3119 opened 2026-03-26 14:41 by BenjaminBossan
[WIP] Generic quantization support for PEFT methods
#3117 opened 2026-03-25 17:23 by BenjaminBossan
FIX Several bugs when adding merged LoRA weights
#3111 opened 2026-03-18 13:16 by BenjaminBossan
CI Move slow EVA tests to nightly GPU CI
#3108 opened 2026-03-17 11:15 by BenjaminBossan
Add KappaTuneSelector: condition-number-based automatic LoRA target selection
#3106 opened 2026-03-16 17:40 by oswaldoludwig
FIX CI Remove invalid arg in nightly GPU test call
#3104 opened 2026-03-16 15:53 by BenjaminBossan
fix: Handle FSDP-sharded parameters in LoRA ParamWrapper get_delta_weight
#3102 opened 2026-03-16 12:42 by BillionClaw
FIX Broken tests with torchao >= 0.15
#3101 opened 2026-03-16 10:55 by BenjaminBossan
add glora
#3098 opened 2026-03-13 01:57 by not-lain
Save checkpoint with TP
#3096 opened 2026-03-12 20:27 by michaelbenayoun
Implement import allowlist in AutoPeftModel, limit access in megatron
#3090 opened 2026-03-11 11:38 by githubnemo
Changes for transformers 5 weight conversion
#3083 opened 2026-03-05 17:23 by BenjaminBossan
Image generation benchmark
#3082 opened 2026-03-04 11:48 by BenjaminBossan
[method_comparison] RL training based method comparison for lora adapters
#3078 opened 2026-03-03 12:35 by kashif
WIP: Support tranformers weight conversion
#3071 opened 2026-02-27 13:22 by githubnemo
DOC: Section on weight tying with LoRA
#3066 opened 2026-02-25 10:41 by BenjaminBossan
ENH Support models with low precision float dtypes
#3055 opened 2026-02-20 16:05 by BenjaminBossan
Refactor layer initialization: PR 2960 continued [wip]
#3047 opened 2026-02-17 18:50 by BenjaminBossan
fix: layers_to_transform now correctly matches layer index on MoE models
#3028 opened 2026-02-07 19:56 by Mr-Neutr0n
[TinyLoRA]tinylora implementation
#3024 opened 2026-02-06 17:41 by kashif
Fix moe layers to transform
#3017 opened 2026-02-01 11:48 by MichalMraz
Add get base model state dict
#3000 opened 2026-01-16 11:23 by Isalia20
Add AdaMSS tuner with Adaptive Subspace Allocation (ASA)
#2987 opened 2026-01-10 07:12 by LonglongaaaGo
Add UniLoRA tuner to PEFT
#2968 opened 2025-12-23 16:14 by KaiyangLi1992
Intergrate MonteCLoRA (TMLR 2025 accepted) into PEFT
#2943 opened 2025-12-09 08:07 by victor7246
WoRA integration into PEFT [wip]
#2872 opened 2025-10-26 20:45 by sambhavnoobcoder
Pull Request: Adding HiRA integration into PEFT library
#2668 opened 2025-07-24 10:32 by hqsiswiliam
FEAT Add sine-LoRA #2434 [wip]
#2457 opened 2025-03-27 06:55 by yipingji