transformers
Add ALL_ATTENTION_FUNCTIONS compatibility for Pixtral model
#37960
Merged
ArthurZucker merged 7 commits into huggingface:main from uminaty:pixtral-all-attn
Add ALL_ATTENTION_FUNCTIONS compatibility for Pixtral model (c67c173c)
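For context, a minimal sketch of what "ALL_ATTENTION_FUNCTIONS compatibility" usually means in transformers: the attention module stops hard-coding a single backend and instead looks the implementation up in the `ALL_ATTENTION_FUNCTIONS` registry, keyed by `config._attn_implementation`. The fragment below follows that common pattern; the variable names and the `dropout`/`scaling` arguments are illustrative assumptions, not necessarily the exact Pixtral diff.

```python
from transformers.modeling_utils import ALL_ATTENTION_FUNCTIONS

# Fragment of an attention module's forward (illustrative):
attention_interface = eager_attention_forward  # plain matmul/softmax fallback
if self.config._attn_implementation != "eager":
    # e.g. "sdpa", "flash_attention_2", "flex_attention", or a custom
    # backend the user registered with the library
    attention_interface = ALL_ATTENTION_FUNCTIONS[self.config._attn_implementation]

attn_output, attn_weights = attention_interface(
    self,
    query_states,
    key_states,
    value_states,
    attention_mask,
    dropout=0.0 if not self.training else self.dropout,
    scaling=self.scale,
    **kwargs,
)
```

In practice this is what lets Pixtral be loaded with `attn_implementation="sdpa"` or `"flash_attention_2"` the same way as other models that already use the shared interface.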
github-actions marked this pull request as draft 275 days ago
uminaty marked this pull request as ready for review 275 days ago
github-actions requested a review from ArthurZucker 275 days ago
github-actions requested a review from zucchini-nlp 275 days ago
Fix invalid operand type (54b71c24)
uminaty force-pushed from 7c43f751 to 54b71c24 275 days ago
Allow image_sizes to be optional in forward pass to fit tests (0d7a1b7e)
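A hedged sketch of what making `image_sizes` optional typically looks like, so that test harnesses which only pass `pixel_values` keep working: when the caller does not supply sizes, derive them from the inputs themselves. The argument and variable names below are assumptions for illustration, not the PR's exact code.

```python
from typing import Optional

import torch

# Fragment of a vision-tower forward (illustrative):
def forward(self, pixel_values, image_sizes: Optional[torch.Tensor] = None, **kwargs):
    if image_sizes is None:
        # Assume each entry in pixel_values is (channels, height, width) and
        # fall back to its raw spatial dimensions as the image size.
        image_sizes = torch.tensor([img.shape[-2:] for img in pixel_values])
    ...
```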
Disallow using sdpa with output_attentions (71827ac0)
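The usual reason for this guard: PyTorch's scaled_dot_product_attention does not return attention weights, so `output_attentions=True` cannot be honored under SDPA. A sketch of the common transformers-style handling (warn once and fall back to eager) is below; whether this PR warns or raises, the exact message, and the variable names are assumptions.

```python
# Common transformers-style guard (illustrative fragment of forward): SDPA cannot
# return attention weights, so when the caller asks for them, fall back to eager.
if self.config._attn_implementation == "sdpa" and output_attentions:
    logger.warning_once(
        "`sdpa` attention does not support `output_attentions=True`; "
        "falling back to eager attention."
    )
    attention_interface = eager_attention_forward
```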
uminaty force-pushed from 20f777d4 to 71827ac0 275 days ago
qubvel commented on 2025-05-05
Delete useless comments, use eager attention from smolvlm, use patter… (9e78ceef)
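The "eager attention from smolvlm" part most likely refers to the module-level `eager_attention_forward` helper that recent transformers models share. A simplified version of that helper (no grouped-query key/value repetition, since this is vision attention) looks roughly like this; treat it as a sketch of the pattern, not the PR's literal code.

```python
from typing import Optional

import torch
from torch import nn


def eager_attention_forward(
    module: nn.Module,
    query: torch.Tensor,
    key: torch.Tensor,
    value: torch.Tensor,
    attention_mask: Optional[torch.Tensor],
    scaling: float,
    dropout: float = 0.0,
    **kwargs,
):
    # Plain scaled dot-product attention computed with matmul + softmax.
    attn_weights = torch.matmul(query, key.transpose(-2, -1)) * scaling
    if attention_mask is not None:
        attn_weights = attn_weights + attention_mask
    attn_weights = nn.functional.softmax(attn_weights, dim=-1, dtype=torch.float32).to(query.dtype)
    attn_weights = nn.functional.dropout(attn_weights, p=dropout, training=module.training)
    attn_output = torch.matmul(attn_weights, value)
    # (batch, heads, seq, head_dim) -> (batch, seq, heads, head_dim)
    attn_output = attn_output.transpose(1, 2).contiguous()
    return attn_output, attn_weights
```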
ArthurZucker approved these changes on 2025-05-06
uminaty force-pushed from 9e78ceef to 50cc674b 274 days ago
zucchini-nlp commented on 2025-05-06
add _supports_attention_backend (9503c773)
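`_supports_attention_backend` is a class-level flag on a model's PreTrainedModel subclass that tells transformers the model routes attention through the shared interface and can therefore accept registered backends. A sketch of where it would live; the surrounding attributes are illustrative assumptions.

```python
from transformers import PixtralVisionConfig, PreTrainedModel


class PixtralPreTrainedModel(PreTrainedModel):
    config_class = PixtralVisionConfig
    _supports_sdpa = True
    _supports_flash_attn_2 = True
    # Signals that attention dispatches through ALL_ATTENTION_FUNCTIONS,
    # so custom registered backends work as well.
    _supports_attention_backend = True
```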
uminaty force-pushed from 50cc674b to 9503c773 274 days ago
use kwargs instead of position_ids (7457672b)
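A hedged reading of this commit: instead of threading `position_ids` explicitly through the attention layers, extra arguments are carried as typed `**kwargs` so that backend-specific parameters (for example the FlashAttention variable-length fields) flow through to whichever interface was selected. The signature below follows the convention used in other transformers models; exactly which parameters Pixtral forwards is an assumption.

```python
from typing import Optional

import torch
from transformers.modeling_flash_attention_utils import FlashAttentionKwargs
from transformers.processing_utils import Unpack

# Fragment of an attention forward signature (illustrative):
def forward(
    self,
    hidden_states: torch.Tensor,
    attention_mask: Optional[torch.Tensor] = None,
    position_embeddings: Optional[tuple[torch.Tensor, torch.Tensor]] = None,
    **kwargs: Unpack[FlashAttentionKwargs],
):
    # kwargs is forwarded untouched to the selected attention interface, so
    # backend-specific arguments (e.g. cumulative sequence lengths) pass through.
    ...
```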
ArthurZucker merged f6664ee7 into main 272 days ago
uminaty deleted the pixtral-all-attn branch 272 days ago
Reviewers: ArthurZucker, zucchini-nlp, qubvel