accelerate · PR #2330: Avoid duplicating memory for tied weights in `dispatch_model`, and in forward with offloading (Merged)
fxmarty merged 12 commits into `main` from `fix-dispatch-model-tied-params-memory`.
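The problem the PR title describes can be illustrated abstractly: when tied weights (e.g. an embedding matrix shared with the LM head) are moved module by module, each module ends up with its own copy, doubling memory. Below is a minimal pure-Python sketch of that failure mode and of copying each distinct storage exactly once. All names are hypothetical; this is not accelerate's implementation (real tied `nn.Parameter`s share one tensor storage, modeled here as a shared list).

```python
# Sketch of the tied-weights duplication problem this PR fixes.
# A "parameter" is modeled as a mutable buffer (a list); tied
# parameters are two references to the same buffer.

def naive_dispatch(modules):
    # Moving each module's params independently breaks ties:
    # every module gets its own copy, even for shared parameters.
    return {name: {p_name: list(param) for p_name, param in params.items()}
            for name, params in modules.items()}

def tie_aware_dispatch(modules):
    # Copy each distinct parameter object once (keyed by identity),
    # so tied parameters still alias a single buffer afterwards.
    moved = {}
    out = {}
    for name, params in modules.items():
        out[name] = {}
        for p_name, param in params.items():
            if id(param) not in moved:
                moved[id(param)] = list(param)
            out[name][p_name] = moved[id(param)]
    return out

shared = [1.0, 2.0]  # e.g. embedding weight tied to the LM head
model = {"embed": {"weight": shared}, "lm_head": {"weight": shared}}

naive = naive_dispatch(model)
tied = tie_aware_dispatch(model)
print(naive["embed"]["weight"] is naive["lm_head"]["weight"])  # False: duplicated
print(tied["embed"]["weight"] is tied["lm_head"]["weight"])    # True: still shared
```

With real tensors the analogous check is whether the two parameters report the same `data_ptr()` after dispatch.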
- Commit 982e6056: wip
- Commit 373a6bea: fix
- Commit 641a22f8: add test
- Commit ec2d94f0: cleanup
- Commit ae573266: style
- Commit 829d33af: style & tests pass
- fxmarty changed the title from "WIP" to "Avoid duplicating memory for tied weights in `dispatch_model`" (2 years ago)
- fxmarty marked this pull request as ready for review (2 years ago)
- fxmarty requested a review from SunMarc (2 years ago)
- fxmarty requested a review from muellerzr (2 years ago)
- fxmarty commented on 2024-01-12
- fxmarty commented on 2024-01-12
- SunMarc approved these changes on 2024-01-12
- SunMarc commented on 2024-01-12
- fxmarty commented on 2024-01-15
- Commit f199f6ba: fix offload, submodules
- fxmarty force-pushed from 90ee1bcb to f199f6ba (2 years ago)
- Commit 4b95a1d1: cleanup
- fxmarty commented on 2024-01-15
- fxmarty commented on 2024-01-15
- fxmarty changed the title from "Avoid duplicating memory for tied weights in `dispatch_model`" to "Avoid duplicating memory for tied weights in `dispatch_model`, and in forward with offloading" (2 years ago)
- SunMarc approved these changes on 2024-01-15
- Commit 2ea8986d: Update tests/test_big_modeling.py
- Commit f9ecf75d: Update tests/test_big_modeling.py
- Commit 30631a65: disk offloading do not reload tied parameters in memory
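The idea behind commit 30631a65 can be sketched as follows: during an offloaded forward pass, a weight that is already loaded for one module should be reused by its tied aliases instead of being read from disk again. The sketch below is hypothetical (the loader, the `tied_to` map, and the function names are not accelerate's actual API); it only demonstrates the dedup-by-canonical-name idea with a read counter.

```python
# Sketch of "disk offloading: do not reload tied parameters in memory".
# A counter stands in for expensive disk reads of offloaded weights.

disk_reads = 0

def load_from_disk(key):
    global disk_reads
    disk_reads += 1
    return [0.0] * 4  # stand-in for deserializing a weight file

def forward_with_offload(param_keys, tied_to):
    # tied_to maps a parameter name to the canonical name whose
    # storage it shares; untied parameters map to themselves.
    cache = {}
    for key in param_keys:
        canonical = tied_to.get(key, key)
        if canonical not in cache:      # hit the disk once per storage
            cache[canonical] = load_from_disk(canonical)
        # ... the module would now run using cache[canonical] ...
    return cache

params = ["embed.weight", "lm_head.weight"]
ties = {"lm_head.weight": "embed.weight"}
forward_with_offload(params, ties)
print(disk_reads)  # 1: the tied weight was not reloaded
```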
- fxmarty commented on 2024-01-16
- fxmarty requested a review from SunMarc (2 years ago)
- SunMarc approved these changes on 2024-01-16
- Commit c1ea6f24: remove outdated comment
- muellerzr approved these changes on 2024-01-16
- fxmarty merged 6719cb6d into main (2 years ago)
- fxmarty deleted the fix-dispatch-model-tied-params-memory branch (2 years ago)
- fxmarty restored the head branch (2 years ago)
Reviewers: SunMarc, muellerzr
Assignees: none
Labels: none
Milestone: none