Save checkpoint with TP #3096
720c7d6f  feat: add hook for lora.Embedding
2c96acf0  fix: typos, deepcopy, minor
f036b7c9  feat: add saving support
cf9fe8bc  fix: remove embedding_colwise support for now
c31f7dad  refactor: cleanup implementation
1c003287  fix: broken conflict resolution during rebase
dec1e01e  fix: missing import and tests
96e7d5cf  fix: replace logger.warning by warnings.warn
d5cb7389  fix: make tp work with low level api
dfcf687c  style: ruff
762c3c30  style: docstring
70de544b  fix: restore unrelated files
4afada5f  refactor: move TpInfo
70dd58de  fix: remove unnecessary check
227baf3b  refactor: simplify iteration on tp_plan
76353c29  refactor: use fixture
4bff6d5c  refactor: import safetensors at module level
e31d715f  feat: check if tp plan exists before looping over modules
49b460dc  feat: load_adapter support
a7a52e9f  test: use monitored barrier
5a2a4f7c  test: rename tests
c6b67168  fix: test name
852a5049  test: skip if not latest transformers release
a4417a1f  feat: update function signature for TP hooks