Add forward compat. for tied_weights_keys dicts (#2902)
* Add forward compat. for tied_weights_keys dicts
In the future, `_tied_weights_keys` will be a mapping (with the destination as key and the source as value).
This also means that the semantics of `_tied_weights_keys` will change from "the keys in this list are
tied to the input embedding" to "this mapping defines tying between layers of any type". To this end, we
limit the scope and provide methods to retrieve input embedding ties.
If the need arises to retrieve more complicated mappings, we can do so at a later point.
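As a rough illustration of the forward-compat handling, the two formats could be normalized as follows (a minimal sketch; the helper name `get_embedding_tied_keys` and the assumed embedding source name are hypothetical, not the actual PEFT implementation):

```python
def get_embedding_tied_keys(model):
    """Return the parameter names tied to the input embedding.

    Older transformers versions store `_tied_weights_keys` as a list of keys
    tied to the input embedding; newer versions (>5) are expected to use a
    dict mapping destination -> source.
    """
    tied = getattr(model, "_tied_weights_keys", None)
    if tied is None:
        return []
    if isinstance(tied, dict):
        # dict form: keep only entries whose source is the input embedding
        # (the source name below is an assumption for illustration)
        embed_name = "model.embed_tokens.weight"
        return [dest for dest, src in tied.items() if src == embed_name]
    # list form: by convention, all keys are tied to the input embedding
    return list(tied)
```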
* Redo _get_module_names_tied_with_embedding
- don't loop over all modules; assume that module tying is
  defined at the top level (there is no precedent for the opposite yet)
- make sure that only the base model is considered to prevent
duplicates due to `getattr` forwarding by PEFT
Also added a few more tests.
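The duplicate problem stems from PEFT wrappers forwarding unknown attributes to the wrapped model, so the same tying attribute is visible on several levels. A sketch of unwrapping to the base model first (the `get_base_model` method exists on PEFT model wrappers; the standalone helper here is illustrative):

```python
def resolve_base_model(model):
    """Unwrap PEFT wrappers to the innermost base model.

    PEFT wrappers forward unknown attributes via `getattr`, so reading
    `_tied_weights_keys` on the wrapper and on the base model would yield
    the same entries twice. Restricting the lookup to the base model
    avoids those duplicates.
    """
    while hasattr(model, "get_base_model"):
        model = model.get_base_model()
    return model
```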
* Address feedback and resolve a few bugs
- PEFT modifications to the model (e.g., adding `(.*)?.base_layer.`) were not handled before but
  are now
- Future testing (transformers >5) needs to switch the roles in the tests (dict is provided, list must be simulated)
- local imports are replaced with `hasattr` checks for specific attributes
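For the `base_layer` point: PEFT replaces targeted modules with wrappers that keep the original module under `.base_layer`, so recorded parameter names can contain an extra `.base_layer.` segment that must be removed before comparing against the names from `_tied_weights_keys`. A sketch of that normalization (the regex and helper name are illustrative assumptions):

```python
import re

def strip_base_layer(name):
    """Remove `.base_layer` segments that PEFT wrappers insert into
    module/parameter names, so they match the unwrapped model's names."""
    return re.sub(r"\.base_layer(?=\.|$)", "", name)
```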