transformers · PR #37131 · Merged
Remove deprecated use_flash_attention_2 parameter
Opened by cyyever
github-actions marked this pull request as draft · 284 days ago
cyyever force-pushed from 29573a31 to 2a99df0a · 284 days ago
cyyever force-pushed from 2a99df0a to 8ad42179 · 284 days ago
cyyever force-pushed from 8ad42179 to c81c9c9d · 284 days ago
cyyever marked this pull request as ready for review · 284 days ago
github-actions requested a review from ArthurZucker · 284 days ago
github-actions requested a review from ydshieh · 284 days ago
cyyever force-pushed from c81c9c9d to 7a8f45ad · 284 days ago
cyyever force-pushed from 7a8f45ad to 2403a813 · 284 days ago
ydshieh removed the review request from ydshieh · 280 days ago
cyyever force-pushed from fb746849 to 4ef791d9 · 262 days ago
cyyever force-pushed from 4ef791d9 to d1fe7d0d · 262 days ago
cyyever committed "Remove deprecated use_flash_attention_2 parameter" (039bde59)
cyyever force-pushed from d1fe7d0d to 039bde59 · 262 days ago
ArthurZucker approved these changes on 2025-06-02
ArthurZucker merged fde1120b into main · 221 days ago
cyyever deleted the use_flash_attention_2 branch · 221 days ago
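For context on what this removal means for callers: in recent transformers releases, the deprecated `use_flash_attention_2=True` flag was superseded by `attn_implementation="flash_attention_2"` on `from_pretrained`. The sketch below is not from the PR itself; the `migrate_kwargs` helper is a hypothetical illustration of that mapping.

```python
def migrate_kwargs(kwargs: dict) -> dict:
    """Translate the removed flag to its replacement (illustrative only)."""
    kwargs = dict(kwargs)  # avoid mutating the caller's dict
    if kwargs.pop("use_flash_attention_2", False):
        # The replacement argument, assuming no explicit choice was already made.
        kwargs.setdefault("attn_implementation", "flash_attention_2")
    return kwargs

# Old call style (rejected once this PR landed):
#   model = AutoModel.from_pretrained("some-model", use_flash_attention_2=True)
# New call style:
#   model = AutoModel.from_pretrained("some-model", attn_implementation="flash_attention_2")
```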
