peft
LoRA and Transformers TP #3079
Merged

michaelbenayoun wip(tp): add hooks to LoRA adapters for TP
e90907a3
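The commit message points at forward hooks that reassemble full activations from the sharded LoRA computation. A minimal sketch of that idea, assuming a column-sharded lora_B output gathered across the tensor-parallel group (the hook body and the traversal are illustrative, not the PR's actual code):

```python
import torch
import torch.distributed as dist
import torch.nn as nn

def gather_lora_output_hook(module: nn.Module, args, output: torch.Tensor) -> torch.Tensor:
    # All-gather the rank-local slice of the LoRA output so every rank sees
    # the full activation. Inference-only sketch; training would need an
    # autograd-aware gather such as torch.distributed.nn.all_gather.
    world_size = dist.get_world_size()
    gathered = [torch.empty_like(output) for _ in range(world_size)]
    dist.all_gather(gathered, output)
    return torch.cat(gathered, dim=-1)

# Hypothetical registration on the LoRA B projections:
# for name, module in model.named_modules():
#     if "lora_B" in name:
#         module.register_forward_hook(gather_lora_output_hook)
```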
michaelbenayoun feat: add hooks to LoRA adapters for TP plan
17b96ee1
michaelbenayoun wip: shard LoRA adapters for TP
8e8ea685
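Sharding the adapters most plausibly follows the usual Megatron-style convention: for a column-parallel base layer the LoRA B matrix is split along its output dimension, while for a row-parallel layer the LoRA A matrix is split along its input dimension, keeping the low-rank bottleneck intact. A hedged sketch of the slicing (helper name and placement are assumptions, not the PR's code):

```python
import torch
import torch.distributed as dist

def shard_weight(weight: torch.Tensor, dim: int) -> torch.Tensor:
    # Return the rank-local shard of `weight` along `dim`, assuming the
    # size is divisible by the tensor-parallel world size.
    rank, world_size = dist.get_rank(), dist.get_world_size()
    chunk = weight.size(dim) // world_size
    return weight.narrow(dim, rank * chunk, chunk).contiguous()

# Assumed placement:
#   column-parallel base layer -> shard lora_B along dim 0 (out_features)
#   row-parallel base layer    -> shard lora_A along dim 1 (in_features)
```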
michaelbenayoun force-pushed from be162533 to 8e8ea685 14 days ago
michaelbenayoun wip: add TP hooks to adapters
a4806952
michaelbenayoun feat: add hooks for TP in LoraModel
744a0a53
michaelbenayoun fix: add lora adapter weight broadcasting after initialization
d934e4c9
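Freshly initialized LoRA weights are random, so without a broadcast each rank would start from different adapter values. A minimal sketch of the fix, assuming the replicated adapter parameters are synchronized from rank 0 (the parameter filter is illustrative):

```python
import torch.distributed as dist

def broadcast_lora_weights(model, src: int = 0) -> None:
    # Make newly initialized (replicated) LoRA parameters identical on all
    # ranks; sharded parameters would need per-shard handling instead.
    for name, param in model.named_parameters():
        if "lora_" in name:
            dist.broadcast(param.data, src=src)
```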
michaelbenayoun style: add comments and remove space
99260d18
michaelbenayoun fix: load and shard from checkpoints with TP
03c94fb6
michaelbenayoun test: add test suites for LoRA + Transformers TP
69e80296
michaelbenayoun refactor: rename test
67d1b392
michaelbenayoun refactor: rename test file
7000ae84
michaelbenayoun style: fix length
4cb247b4
michaelbenayoun requested a review from BenjaminBossan 12 days ago
michaelbenayoun marked this pull request as ready for review 12 days ago
3outeille commented on 2026-03-09
michaelbenayoun test: add overfitting test
e9536e94
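An overfitting test is a standard end-to-end check here: if gradients flow correctly through the sharded adapters, training on a single small batch should drive the loss down sharply. A sketch of the idea (the model, batch, and threshold are placeholders, not the PR's actual test):

```python
import torch

def check_overfit(model, batch, steps: int = 50) -> None:
    # Train on one batch; the final loss should be far below the first.
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
    losses = []
    for _ in range(steps):
        loss = model(**batch).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
        losses.append(loss.item())
    assert losses[-1] < 0.1 * losses[0]
```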
michaelbenayoun fix: remove comma and unnecessary if statement
1c6d4a06
michaelbenayoun style: torch.distributed to dist
25e7926d
michaelbenayoun Merge branch 'main' into lora_and_tp
4ce5ba38
michaelbenayoun fix: move adapters to device before broadcast
38865ff2
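NCCL collectives only operate on CUDA tensors, so adapters initialized on CPU have to be moved to the local device before they can be broadcast. A hedged sketch of the fix (the loop and filter are illustrative):

```python
import torch
import torch.distributed as dist

def move_and_broadcast(model, src: int = 0) -> None:
    # Move LoRA parameters to the local GPU first; broadcasting a CPU
    # tensor through the NCCL backend would fail.
    device = torch.device("cuda", torch.cuda.current_device())
    for name, param in model.named_parameters():
        if "lora_" in name:
            param.data = param.data.to(device)
            dist.broadcast(param.data, src=src)
```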
BenjaminBossan requested changes on 2026-03-12
michaelbenayoun fix: check for correct transformers version
dbcbbddd
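The transformers TP plan machinery only exists in recent releases, so the feature has to be gated on the installed version. A sketch of such a guard (the minimum version string is a placeholder, not the bound the PR actually checks):

```python
import importlib.metadata
from packaging import version

def is_transformers_tp_available(min_version: str = "4.50.0") -> bool:
    # Placeholder bound: return True only when the installed transformers
    # is new enough to expose the TP plan API.
    installed = version.parse(importlib.metadata.version("transformers"))
    return installed >= version.parse(min_version)
```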
michaelbenayoun fix: lazy import to avoid failing with older transformers versions
fe773497
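Moving the import inside a function means `import peft` itself no longer touches the TP utilities, so older transformers installs only fail if the feature is actually used. A sketch of the pattern (the imported module path is an assumption about transformers' layout, not verified against the PR):

```python
def _get_tp_integration():
    # Resolve the transformers TP helpers lazily, at call time, so that
    # merely importing this module works on older transformers versions.
    from transformers.integrations import tensor_parallel  # assumed module path
    return tensor_parallel
```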
michaelbenayoun doc: mention TP support in LoRA docs
07daec84
michaelbenayoun style: extend comment on empty_param
786fceb7
michaelbenayoun refactor: remove duplicated code
4fa66822
michaelbenayoun refactor: remove duplicated code when adding the hooks
ed5e33ce
michaelbenayoun style: ruff format
89b3f35d
michaelbenayoun test: move tp tests to gpu
0d238ec4
michaelbenayoun test: remove test file
fb0e31a8
michaelbenayoun fix: typos and arguments
e254738e
michaelbenayoun fix: restore pyproject.toml
32eeb5a7
BenjaminBossan commented on 2026-03-17
michaelbenayoun fix: tests on GPUs
040111e6
michaelbenayoun fix: use internal model
c3d15fb4
michaelbenayoun requested a review from BenjaminBossan 1 day ago
BenjaminBossan requested changes on 2026-03-17
michaelbenayoun fix: restore pyproject.toml
75c2f3b7
michaelbenayoun fix: change repo name to internal
14a29722
michaelbenayoun test: launch tp training integration test
749ee5bb
michaelbenayoun refactor: add decorator to the test class
26fa93f5
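Applying the skip condition once at the class level avoids repeating it on every test method. A sketch using plain pytest (peft's own require_* test helpers may be what the PR actually uses):

```python
import pytest
import torch

@pytest.mark.skipif(torch.cuda.device_count() < 2, reason="requires at least 2 GPUs")
class TestLoraTensorParallel:
    # Every test in the class inherits the multi-GPU requirement.
    def test_training_runs(self):
        ...
```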
michaelbenayoun Merge branch 'main' into lora_and_tp
7f14a565
michaelbenayoun requested a review from BenjaminBossan 1 day ago
BenjaminBossan commented on 2026-03-18
michaelbenayoun fix: add missing config file
ecb29ed4
BenjaminBossan approved these changes on 2026-03-18
BenjaminBossan merged 3fb7842e into main 13 hours ago
michaelbenayoun deleted the lora_and_tp branch 11 hours ago
