transformers
Fix warning for output_attentions=True
#40597
Merged

qubvel committed: Fix attn_implementation for output_attentions (4eb1b7da)
qubvel committed: remove setting attention, just raise warning (c7af5528)
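The commit message above describes the PR's behavior change: instead of silently overriding the model's `attn_implementation` when attention weights are requested, the code now only raises a warning. A minimal sketch of that logic, assuming a hypothetical helper (`check_output_attentions` is not the actual transformers function name, and the message text is illustrative):

```python
# Minimal sketch (not the actual transformers code) of the behavior the
# commit describes: when output_attentions=True but the configured attention
# backend cannot return attention weights, warn instead of silently
# switching attn_implementation to "eager".
import warnings


def check_output_attentions(attn_implementation: str, output_attentions: bool) -> None:
    """Warn if attention weights were requested but the backend can't return them."""
    if output_attentions and attn_implementation != "eager":
        warnings.warn(
            f"`output_attentions=True` is not supported with "
            f"`attn_implementation={attn_implementation!r}`; returned attentions "
            "will be None. Set `attn_implementation='eager'` to get attention weights.",
            UserWarning,
        )


check_output_attentions("sdpa", output_attentions=True)   # emits a UserWarning
check_output_attentions("eager", output_attentions=True)  # no warning
```

The design point of the PR, per the commit message, is that only the warning is raised; the user's chosen implementation is left untouched.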
qubvel requested a review from zucchini-nlp 222 days ago
qubvel committed: improve message (c415b1da)
qubvel changed the title from "Fix attn_implementation for output_attentions=True" to "Fix warning for output_attentions=True" 221 days ago
zucchini-nlp approved these changes on 2025-09-02
qubvel commented on 2025-09-02
qubvel committed: Update src/transformers/utils/generic.py (760ff457)
qubvel enabled auto-merge (squash) 221 days ago
qubvel committed: Merge branch 'main' into add-warning-for-attn (fc74b01c)
qubvel committed: Merge branch 'main' into add-warning-for-attn (941ee54a)
qubvel committed: Merge branch 'main' into add-warning-for-attn (5bf89f87)
qubvel merged 55676d7d into main 220 days ago
vasqu commented on 2025-09-03