Make gradient-checkpoint enabling tolerant of models without get_input_embeddings #42558
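The PR title describes the goal: enabling gradient checkpointing should not crash on models that never implemented `get_input_embeddings()`. A minimal sketch of that tolerance idea follows; the helper name `enable_input_require_grads_tolerant` and the stub classes are illustrative assumptions, not the actual transformers code.

```python
# Hedged sketch: tolerate models whose get_input_embeddings() raises
# NotImplementedError instead of crashing when enabling gradient checkpointing.

class ModelWithEmbeddings:
    def get_input_embeddings(self):
        return "embedding-module"  # stand-in for an nn.Embedding

class ModelWithoutEmbeddings:
    def get_input_embeddings(self):
        # Mirrors the base-class default of raising when a subclass
        # has not wired up an embedding getter.
        raise NotImplementedError

def enable_input_require_grads_tolerant(model):
    """Return True if input-embedding grads could be enabled,
    False (with a warning instead of a crash) otherwise."""
    try:
        embeddings = model.get_input_embeddings()
    except NotImplementedError:
        print(f"warning: {type(model).__name__} has no input embeddings; skipping")
        return False
    # Real code would register a forward hook here that calls
    # output.requires_grad_(True) on the embedding output.
    return embeddings is not None

print(enable_input_require_grads_tolerant(ModelWithEmbeddings()))    # True
print(enable_input_require_grads_tolerant(ModelWithoutEmbeddings())) # False
```

The key design choice is catching `NotImplementedError` and degrading to a warning, so models without an embedding getter can still opt into checkpointing.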
add embedding getter
d5909a7c
modify your own logic
a9eb6348
a common test
b520cc7d
some adapters are not PreTrainedModel subclasses
7ce45fea
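The commit above notes that some adapters are not `PreTrainedModel` subclasses, which is why a strict `isinstance` check is too brittle. A hedged illustration of the duck-typing alternative follows; every class name here is a stand-in, not the real API.

```python
# Hypothetical illustration: adapter wrappers (e.g. PEFT-style) may wrap a
# model without subclassing it, so isinstance() fails even though the
# wrapper forwards get_input_embeddings(). Checking for the method itself
# is more tolerant.

class PreTrainedModelStub:
    def get_input_embeddings(self):
        return "embeddings"

class AdapterWrapper:  # deliberately NOT a PreTrainedModelStub subclass
    def __init__(self, wrapped):
        self.wrapped = wrapped

    def get_input_embeddings(self):
        return self.wrapped.get_input_embeddings()

def supports_embedding_getter(model):
    # duck-typed check instead of isinstance
    return callable(getattr(model, "get_input_embeddings", None))

adapter = AdapterWrapper(PreTrainedModelStub())
print(isinstance(adapter, PreTrainedModelStub))  # False
print(supports_embedding_getter(adapter))        # True
```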
few fixes
d41e2046
implement correct-ish fix?
0e93a613
fixup
de8ff713
this is needed likely
b2618b3d
whoops
5d61150c
solving some cross-imports issues here and there
ef554994
more cross-imports issues
44ab4c60
finally
fe89c1ce
revert changes
2920d002
fixups
b8ccd0f2
improve message
b5ae5a60
add common tests for input_ids first
d209ff5e
increase test coverage
79665d46
Merge branch 'main' into fix_enable_grads_again
844c7072
bigger update for GC
fcc84a40
copies
e970fad3
molbap commented on 2025-12-04
mlcd is getting on my nerves a bit
b4f5c15d
ah yes
0246a700
for BC
81940dd2
molbap changed the title from "Add embedding getter + test" to "Make gradient-checkpoint enabling tolerant of models without get_input_embeddings" 96 days ago
break a couple modelings
284189ab
Merge branch 'main' into fix_enable_grads_again
1079eef1
simplify with base_model
f4795980
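The "simplify with base_model" commit suggests resolving the embeddings through a model's `base_model` attribute when the model itself has no usable getter. A hedged sketch of that delegation follows; the helper name `resolve_input_embeddings` is an assumption, not the exact transformers implementation.

```python
# Hypothetical sketch of the base_model fallback: try the model's own
# getter first, then delegate to base_model if present.

class BaseModel:
    def get_input_embeddings(self):
        return "shared-embedding"

class WrapperModel:
    # Wrapper without its own get_input_embeddings; exposes base_model.
    def __init__(self):
        self.base_model = BaseModel()

def resolve_input_embeddings(model):
    """Try the model itself, then fall back to its base_model, if any."""
    getter = getattr(model, "get_input_embeddings", None)
    if callable(getter):
        try:
            return getter()
        except NotImplementedError:
            pass
    base = getattr(model, "base_model", None)
    if base is not None and base is not model:
        return resolve_input_embeddings(base)
    return None

print(resolve_input_embeddings(WrapperModel()))  # "shared-embedding"
```

The `base is not model` guard avoids infinite recursion for models whose `base_model` property returns the model itself.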
fix copies for torch checkpointing
73b4f5dc
simplify this model
0e7086f4
Merge branch 'main' into fix_enable_grads_again
18d44ba6
improve messages
00cc6696
Merge branch 'main' into fix_enable_grads_again
d9d74429
molbap merged b712a97d into main 83 days ago
molbap deleted the fix_enable_grads_again branch 83 days ago