Fix GraniteMoeHybrid in transformers v5 (#42872)
* `apply_rotary_pos_emb` should be called
* fix `position_embeddings` usage in GraniteMoeHybrid
* setting `self.rotary_emb` to `None` only in hybrid models; safer, since all modules are highly modular (see the init sketch after this list)
* minor
* adding `position_embedding_type` to the config (see the config example after this list).
* review cleanup
* update the modeling file too
* rewrite the conditional application of RoPE (see the attention sketch after this list)
* resolve the `rotary_emb` issue
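
Below is a minimal, self-contained sketch of the conditional RoPE application this PR is about: `apply_rotary_pos_emb` is only called when rotary embeddings were actually computed, and layers that do not use RoPE simply receive `position_embeddings=None`. The helper mirrors the standard rotate-half RoPE used across transformers; the function names, argument order, and tensor shapes are illustrative assumptions, not the exact GraniteMoeHybrid diff.

```python
# Sketch only: illustrative names/shapes, not the exact GraniteMoeHybrid code.
from typing import Optional, Tuple

import torch


def rotate_half(x: torch.Tensor) -> torch.Tensor:
    # Rotate the last dimension: (x1, x2) -> (-x2, x1).
    x1, x2 = x.chunk(2, dim=-1)
    return torch.cat((-x2, x1), dim=-1)


def apply_rotary_pos_emb(q, k, cos, sin):
    # Standard rotary embedding: x' = x * cos + rotate_half(x) * sin.
    cos, sin = cos.unsqueeze(1), sin.unsqueeze(1)  # broadcast over attention heads
    return q * cos + rotate_half(q) * sin, k * cos + rotate_half(k) * sin


def maybe_apply_rope(
    query_states: torch.Tensor,  # (batch, num_heads, seq_len, head_dim)
    key_states: torch.Tensor,    # (batch, num_heads, seq_len, head_dim)
    position_embeddings: Optional[Tuple[torch.Tensor, torch.Tensor]],
) -> Tuple[torch.Tensor, torch.Tensor]:
    # Only apply RoPE when rotary embeddings were actually computed;
    # layers without rotary embeddings pass position_embeddings=None.
    if position_embeddings is not None:
        cos, sin = position_embeddings
        query_states, key_states = apply_rotary_pos_emb(query_states, key_states, cos, sin)
    return query_states, key_states
```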
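
And a sketch of the other half of the change: only building rotary embeddings when the config asks for them, and keeping `self.rotary_emb = None` otherwise, so the forward pass hands `position_embeddings=None` down to the layers. Only `position_embedding_type` and `rotary_emb` come from the PR; the class name, default value, and `rotary_cls` injection are assumptions for illustration.

```python
import torch
import torch.nn as nn


class HybridBackboneSketch(nn.Module):
    """Illustrative stand-in for the GraniteMoeHybrid backbone, not the real class."""

    def __init__(self, config, rotary_cls=None):
        super().__init__()
        # `position_embedding_type` is the new config field from this PR;
        # the "rope" default and the injected rotary class are assumptions.
        use_rope = getattr(config, "position_embedding_type", "rope") == "rope"
        self.rotary_emb = rotary_cls(config) if (use_rope and rotary_cls is not None) else None

    def forward(self, hidden_states: torch.Tensor, position_ids: torch.Tensor):
        # When rotary_emb is None (hybrid models without RoPE), the layers
        # receive position_embeddings=None and skip apply_rotary_pos_emb
        # (see maybe_apply_rope in the attention sketch above).
        position_embeddings = (
            self.rotary_emb(hidden_states, position_ids) if self.rotary_emb is not None else None
        )
        return hidden_states, position_embeddings
```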
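
For completeness, a hedged usage example of the new config field; the set of accepted values is defined by the PR itself, so `"rope"` below is only an assumed example value.

```python
from transformers import GraniteMoeHybridConfig

# `position_embedding_type` is the field added by this PR; the value "rope"
# is an assumed example, check the config docstring for the accepted options.
config = GraniteMoeHybridConfig(position_embedding_type="rope")
print(config.position_embedding_type)
```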