transformers

fix flash attention comment #33394

Open

nbroad1881 wants to merge 2 commits into main from fix-flash-comment
nbroad1881 fix comment (633f4363)
nbroad1881 changed the title from "fix comment" to "fix flash attention comment" 1 year ago
nbroad1881 requested a review from gante 1 year ago
nbroad1881 requested a review from ArthurZucker 1 year ago
ArthurZucker approved these changes on 2024-09-21
nbroad1881 Update src/transformers/models/llama/modeling_llama.py (dd94152e)
gante approved these changes on 2024-09-24
