fix flash attention comment #33394
fix comment
633f4363
nbroad1881 changed the title from "fix comment" to "fix flash attention comment" 1 year ago
Update src/transformers/models/llama/modeling_llama.py
dd94152e
gante approved these changes on 2024-09-24