[Llama FA2] Re-add _expand_attention_mask and clean a couple things (#27074)
* Clean up the Llama modeling code
* Fix remaining issues
* make style
* Apply suggestions from code review
* Update src/transformers/models/llama/modeling_llama.py
* Finish
* make style
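
For context, `_expand_attention_mask`-style helpers in transformers turn a 2D padding mask into the 4D additive mask the attention layers consume. Below is a minimal sketch of that technique: the function name comes from the PR title, but the signature and body are illustrative assumptions, not the PR's actual code.

```python
from typing import Optional

import torch


def _expand_attention_mask(
    mask: torch.Tensor, dtype: torch.dtype, tgt_len: Optional[int] = None
) -> torch.Tensor:
    """Expand a 2D padding mask `[batch, src_len]` (1 = attend, 0 = pad) into a
    4D additive mask `[batch, 1, tgt_len, src_len]`. (Hypothetical signature.)"""
    batch_size, src_len = mask.shape
    tgt_len = tgt_len if tgt_len is not None else src_len

    # Broadcast to [batch, 1, tgt_len, src_len] and cast to the compute dtype.
    expanded = mask[:, None, None, :].expand(batch_size, 1, tgt_len, src_len).to(dtype)

    # Invert so attended positions become 0.0 and padded positions become the
    # most negative representable value in `dtype`.
    inverted = 1.0 - expanded
    return inverted.masked_fill(inverted.to(torch.bool), torch.finfo(dtype).min)


mask = torch.tensor([[1, 1, 0]])                    # last token is padding
print(_expand_attention_mask(mask, torch.float32))  # 0.0 where attended, dtype-min where padded
```

The additive formulation lets the model simply add the mask to the raw attention scores before softmax, so padded positions end up with effectively zero probability.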