transformers
183f442b - Fix resuming PeftModel checkpoints in Trainer (#24274)

2 years ago
Fix resuming PeftModel checkpoints in Trainer (#24274)

* Fix resuming checkpoints for PeftModels

Fix an error that occurred when resuming a PeftModel from a training checkpoint. It was caused because PeftModel.save_pretrained saves only adapter-related data, while _load_from_checkpoint was expecting a full torch-saved model. This PR fixes the issue and allows the adapter checkpoint to be loaded.

Resolves: #24252

* fix last comment

* fix nits

---------

Co-authored-by: younesbelkada <younesbelkada@gmail.com>
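The core of the fix is detecting which kind of checkpoint is on disk before deciding how to load it: a PEFT adapter checkpoint contains only adapter files, not a full model state dict. A minimal sketch of that detection logic, assuming the conventional checkpoint file names (the actual constants and loading code in transformers differ):

```python
import os

# Conventional checkpoint file names, used here for illustration only;
# the real constants are defined inside transformers/peft.
WEIGHTS_NAME = "pytorch_model.bin"
ADAPTER_WEIGHTS_NAME = "adapter_model.bin"
ADAPTER_CONFIG_NAME = "adapter_config.json"


def checkpoint_kind(checkpoint_dir: str) -> str:
    """Classify a checkpoint directory.

    Returns "full" when a regular torch-saved model is present,
    "adapter" when only PEFT adapter files are present, and
    "unknown" otherwise.
    """
    has_full = os.path.isfile(os.path.join(checkpoint_dir, WEIGHTS_NAME))
    has_adapter = os.path.isfile(
        os.path.join(checkpoint_dir, ADAPTER_WEIGHTS_NAME)
    ) and os.path.isfile(os.path.join(checkpoint_dir, ADAPTER_CONFIG_NAME))
    if has_full:
        return "full"
    if has_adapter:
        return "adapter"
    return "unknown"
```

In the patched `_load_from_checkpoint`, the adapter branch then restores the adapter weights onto the wrapped PeftModel (rather than expecting a full state dict), which is what makes resuming work.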