fix: Load model with co-located adapter from local path (Granite Speech) (#43781)
* fix: Only overwrite the pretrained_model_name_or_path if needed with adapter
The check assumes that if the current value is a path on disk and a
`config.json` is present in that path, then the path points to a full
model checkpoint with an embedded adapter.
https://github.com/huggingface/transformers/issues/43746
Branch: PeftLocalCheckpoint-43746
Signed-off-by: Gabe Goodhart <ghart@us.ibm.com>
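The on-disk conditional described above can be sketched roughly as follows. This is a minimal illustration, not the actual transformers implementation; the function and parameter names here are hypothetical:

```python
import os


def resolve_base_model_path(current_path: str, adapter_base_model_path: str) -> str:
    """Sketch of the check: only fall back to the base model path recorded
    in the adapter config when the current path is NOT a local full
    checkpoint. (Hypothetical helper; names are illustrative only.)
    """
    is_local_full_checkpoint = os.path.isdir(current_path) and os.path.isfile(
        os.path.join(current_path, "config.json")
    )
    if is_local_full_checkpoint:
        # A config.json on disk implies a full model checkpoint with an
        # embedded (co-located) adapter, so keep the user-supplied path.
        return current_path
    # Otherwise defer to the base model recorded in the adapter config.
    return adapter_base_model_path
```

With this guard, a local Granite Speech checkpoint directory containing both the model weights and its adapter is loaded from the given path rather than being redirected to the hub path stored in the adapter config.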
* test: Add a unit test to validate that the path in the adapter config does not override a local checkpoint path
https://github.com/huggingface/transformers/issues/43746
Branch: PeftLocalCheckpoint-43746
Signed-off-by: Gabe Goodhart <ghart@us.ibm.com>
* feat(peft, pipelines): Apply on-disk conditional to direct peft integration and pipelines init
https://github.com/huggingface/transformers/issues/43746
Branch: PeftLocalCheckpoint-43746
Signed-off-by: Gabe Goodhart <ghart@us.ibm.com>
* test: Add unit tests for peft_integration and pipeline local embedded adapters
https://github.com/huggingface/transformers/issues/43746
Branch: PeftLocalCheckpoint-43746
Signed-off-by: Gabe Goodhart <ghart@us.ibm.com>
AI-usage: draft
* style: Comment improvements based on review feedback
https://github.com/huggingface/transformers/issues/43746
Branch: PeftLocalCheckpoint-43746
Signed-off-by: Gabe Goodhart <ghart@us.ibm.com>
AI-usage: none
* Apply suggestion from @Rocketknight1
---------
Co-authored-by: Matt <Rocketknight1@users.noreply.github.com>