Fix resized LM head weights being overwritten by post_init (#45079)
When `tie_word_embeddings=False`, `_get_resized_lm_head()` creates a new
`nn.Linear` without setting `_is_hf_initialized`, so `post_init()`
re-initializes the freshly copied weights and the resized head loses its
pretrained values. The fix sets `_is_hf_initialized = True` on the new head
after the weight copy, so `post_init()` skips it.
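
A minimal sketch of the pattern, outside of transformers: `resize_lm_head` and the toy `post_init` below are hypothetical stand-ins for `_get_resized_lm_head()` and the library's real `post_init()`, but the `_is_hf_initialized` flag and the skip-if-flagged behavior mirror the mechanism this PR relies on.

```python
import torch
import torch.nn as nn

def resize_lm_head(old_head: nn.Linear, new_num_tokens: int) -> nn.Linear:
    """Create a larger head and copy over the overlapping rows of weights."""
    new_head = nn.Linear(old_head.in_features, new_num_tokens,
                         bias=old_head.bias is not None)
    n = min(old_head.out_features, new_num_tokens)
    with torch.no_grad():
        new_head.weight[:n] = old_head.weight[:n]
        if old_head.bias is not None:
            new_head.bias[:n] = old_head.bias[:n]
    # The fix: flag the module so a post-init pass leaves the copied
    # weights alone instead of re-initializing them.
    new_head._is_hf_initialized = True
    return new_head

def post_init(model: nn.Module) -> None:
    """Toy re-init pass that, like transformers' post_init(),
    skips modules flagged as already initialized."""
    for module in model.modules():
        if getattr(module, "_is_hf_initialized", False):
            continue
        if isinstance(module, nn.Linear):
            nn.init.zeros_(module.weight)

old = nn.Linear(16, 8)
new = resize_lm_head(old, 12)
before = new.weight.clone()
post_init(nn.Sequential(new))
# Without the flag, post_init would have zeroed the weights;
# with it, the copied rows survive.
assert torch.equal(new.weight, before)
```

Without the `_is_hf_initialized = True` line, the assertion fails because the re-init pass zeroes the weights that were just copied, which is exactly the bug in #35141.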
Fixes #35141