transformers
Add ALL_ATTENTION_FUNCTIONS compatibility for Pixtral model #37960
Merged

Add ALL_ATTENTION_FUNCTIONS compatibility for Pixtral model
c67c173c
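For context on what the title commit does, the snippet below is a minimal sketch (not the PR's diff) of the dispatch pattern transformers uses for pluggable attention backends, assuming the `ALL_ATTENTION_FUNCTIONS` registry exposed by `transformers.modeling_utils` in recent releases; the module and tensor names are illustrative.

```python
# Minimal sketch of backend dispatch via ALL_ATTENTION_FUNCTIONS (illustrative, not the PR diff).
from typing import Callable, Optional

import torch
from torch import nn
from transformers.modeling_utils import ALL_ATTENTION_FUNCTIONS


def eager_attention_forward(module, query, key, value, attention_mask, scaling, dropout=0.0, **kwargs):
    # Plain softmax(QK^T * scaling)V fallback used when no optimized backend is selected.
    attn_weights = torch.matmul(query, key.transpose(2, 3)) * scaling
    if attention_mask is not None:
        attn_weights = attn_weights + attention_mask
    attn_weights = nn.functional.softmax(attn_weights, dim=-1, dtype=torch.float32).to(query.dtype)
    attn_weights = nn.functional.dropout(attn_weights, p=dropout, training=module.training)
    attn_output = torch.matmul(attn_weights, value).transpose(1, 2).contiguous()
    return attn_output, attn_weights


class SketchPixtralAttention(nn.Module):
    # Hypothetical attention module: only the backend-dispatch logic matters here.
    def __init__(self, config):
        super().__init__()
        self.config = config
        self.head_dim = config.hidden_size // config.num_attention_heads
        self.scaling = self.head_dim**-0.5
        self.dropout = 0.0

    def forward(self, query_states, key_states, value_states, attention_mask: Optional[torch.Tensor] = None, **kwargs):
        # Default to eager; otherwise look the backend up by name ("sdpa", "flash_attention_2", ...).
        attention_interface: Callable = eager_attention_forward
        if self.config._attn_implementation != "eager":
            attention_interface = ALL_ATTENTION_FUNCTIONS[self.config._attn_implementation]

        attn_output, attn_weights = attention_interface(
            self,
            query_states,
            key_states,
            value_states,
            attention_mask,
            dropout=0.0 if not self.training else self.dropout,
            scaling=self.scaling,
            **kwargs,
        )
        return attn_output, attn_weights
```

With this dispatch in place, a backend can be requested by name, e.g. `attn_implementation="sdpa"` or `"flash_attention_2"` at `from_pretrained` time, and any custom backend registered in `ALL_ATTENTION_FUNCTIONS` becomes selectable as well.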
github-actions marked this pull request as draft 275 days ago
uminaty marked this pull request as ready for review 275 days ago
github-actions requested a review from ArthurZucker 275 days ago
github-actions requested a review from zucchini-nlp 275 days ago
Fix invalid operand type
54b71c24
uminaty force-pushed from 7c43f751 to 54b71c24 275 days ago
Allow image_sizes to be optional in forward pass to fit tests
0d7a1b7e
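This commit makes `image_sizes` optional so the shared model tests can call the forward pass with `pixel_values` alone. A hedged sketch of one way to do that, assuming the sizes can be recovered from the pixel tensor's spatial dimensions when the caller omits them; the helper name below is hypothetical, not taken from the PR.

```python
# Hypothetical helper: derive image_sizes from pixel_values when the caller does not provide them.
from typing import Optional

import torch


def resolve_image_sizes(pixel_values: torch.Tensor, image_sizes: Optional[torch.Tensor] = None) -> torch.Tensor:
    # pixel_values: (batch, channels, height, width); fall back to the padded spatial size per image.
    if image_sizes is None:
        batch_size, _, height, width = pixel_values.shape
        image_sizes = torch.tensor([[height, width]] * batch_size, device=pixel_values.device)
    return image_sizes
```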
Disallow using sdpa with output_attentions
71827ac0
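This commit adds the usual guard for backends that cannot return attention weights: torch's `scaled_dot_product_attention` never exposes the softmax probabilities, so a request for `output_attentions=True` has to fall back to the eager path. A minimal sketch of that guard, reusing the `eager_attention_forward` and `ALL_ATTENTION_FUNCTIONS` names from the earlier snippet; the warning text is illustrative.

```python
# Sketch of the guard inside the attention forward (eager_attention_forward as defined above).
import logging

from transformers.modeling_utils import ALL_ATTENTION_FUNCTIONS

logger = logging.getLogger(__name__)


def select_attention_interface(config, output_attentions: bool):
    attention_interface = eager_attention_forward  # assumed fallback from the earlier sketch
    if config._attn_implementation != "eager":
        if config._attn_implementation == "sdpa" and output_attentions:
            # SDPA cannot return attention weights, so returning None silently would be wrong;
            # warn and use the eager path for this call instead.
            logger.warning(
                "`sdpa` attention does not support `output_attentions=True`; falling back to eager attention."
            )
        else:
            attention_interface = ALL_ATTENTION_FUNCTIONS[config._attn_implementation]
    return attention_interface
```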
uminaty force-pushed from 20f777d4 to 71827ac0 275 days ago
qubvel commented on 2025-05-05
Delete useless comments, use eager attention from smolvlm, use patter…
9e78ceef
ArthurZucker approved these changes on 2025-05-06
uminaty force-pushed from 9e78ceef to 50cc674b 274 days ago
zucchini-nlp commented on 2025-05-06
add _supports_attention_backend
9503c773
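`_supports_attention_backend` is the PreTrainedModel class flag that declares the model's attention layers go through the shared attention interface, so registered backends can be selected for it. A hedged sketch of how such a flag sits alongside the other capability flags; the class body is abbreviated and illustrative.

```python
# Illustrative only: capability flags on a Pixtral-style PreTrainedModel subclass.
from transformers import PretrainedConfig, PreTrainedModel


class SketchPixtralPreTrainedModel(PreTrainedModel):
    config_class = PretrainedConfig        # placeholder; the real model ties this to its own config class
    _supports_sdpa = True                  # torch.nn.functional.scaled_dot_product_attention
    _supports_flash_attn_2 = True          # flash-attention-2 kernels
    _supports_attention_backend = True     # any backend registered in ALL_ATTENTION_FUNCTIONS
```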
uminaty force-pushed from 50cc674b to 9503c773 274 days ago
use kwargs instead of position_ids
7457672b
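The final cleanup threads backend-specific arguments through as `**kwargs` rather than pinning an explicit `position_ids` parameter in the attention signature, which is the convention the registered attention functions expect. A sketch of that signature, assuming `FlashAttentionKwargs` from `transformers.modeling_flash_attention_utils`; the class and tensor names are illustrative.

```python
# Sketch of a forward signature that forwards backend kwargs instead of a fixed position_ids argument.
from typing import Optional

import torch
from typing_extensions import Unpack
from transformers.modeling_flash_attention_utils import FlashAttentionKwargs


class SketchAttentionSignature(torch.nn.Module):
    def forward(
        self,
        hidden_states: torch.Tensor,
        attention_mask: Optional[torch.Tensor] = None,
        **kwargs: Unpack[FlashAttentionKwargs],  # e.g. cu_seq_lens_* / max_length_* hints for flash attention
    ):
        # kwargs are handed straight to the selected attention backend, e.g.:
        # attn_output, _ = attention_interface(self, q, k, v, attention_mask, **kwargs)
        return hidden_states
```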
ArthurZucker merged f6664ee7 into main 272 days ago
uminaty deleted the pixtral-all-attn branch 272 days ago
