transformers
d7cb5e13 - [Llama FA2] Re-add _expand_attention_mask and clean a couple things (#27074)

[Llama FA2] Re-add _expand_attention_mask and clean a couple things (#27074)

* clean
* clean llama
* fix more
* make style
* Apply suggestions from code review
* Apply suggestions from code review
* Update src/transformers/models/llama/modeling_llama.py
* Update src/transformers/models/llama/modeling_llama.py
* Apply suggestions from code review
* finish
* make style
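For context, helpers of the `_expand_attention_mask` kind in `modeling_llama.py` turn a 2D padding mask of shape `[batch_size, src_len]` into the 4D additive mask that the attention layers consume. The snippet below is a minimal sketch of that pattern; the exact signature, argument names, and behavior of the helper restored in this commit are assumptions, not the code from the PR.

```python
from typing import Optional

import torch


def _expand_attention_mask(
    mask: torch.Tensor, dtype: torch.dtype, tgt_len: Optional[int] = None
) -> torch.Tensor:
    """Sketch: expand a [batch_size, src_len] padding mask (1 = attend, 0 = pad)
    into a [batch_size, 1, tgt_len, src_len] additive mask."""
    bsz, src_len = mask.size()
    tgt_len = tgt_len if tgt_len is not None else src_len

    # Broadcast to [batch_size, 1, tgt_len, src_len] so it can be added to attention scores.
    expanded = mask[:, None, None, :].expand(bsz, 1, tgt_len, src_len).to(dtype)

    # Invert: attended positions become 0.0, padded positions become the most negative
    # representable value, so they vanish after the softmax.
    inverted = 1.0 - expanded
    return inverted.masked_fill(inverted.to(torch.bool), torch.finfo(dtype).min)
```

For example, a padding mask `torch.tensor([[1, 1, 1, 0, 0]])` expands under this sketch to a `[1, 1, 5, 5]` tensor whose last two columns are filled with `torch.finfo(dtype).min`, masking the padded positions for every query token.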