transformers #37960 (Merged)
Add ALL_ATTENTION_FUNCTIONS compatibility for Pixtral model
Commits (7)
Add ALL_ATTENTION_FUNCTIONS compatibility for Pixtral model (aurelien.lac, 233 days ago)
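This first commit follows the attention-interface refactor used across recent transformers models: keep a plain eager implementation local to the model and resolve every other backend by name from `ALL_ATTENTION_FUNCTIONS`. A minimal sketch of that dispatch; `pick_attention_interface` is a hypothetical helper for illustration (in the real modeling code this logic sits inline in the attention forward), not the PR's exact diff:

```python
# Sketch of the ALL_ATTENTION_FUNCTIONS dispatch pattern.
from typing import Callable

from transformers.modeling_utils import ALL_ATTENTION_FUNCTIONS


def eager_attention_forward(*args, **kwargs):
    """Stand-in for the local eager implementation (sketched under commit 5)."""
    raise NotImplementedError


def pick_attention_interface(config) -> Callable:
    # config._attn_implementation is set from the `attn_implementation`
    # argument of from_pretrained (e.g. "eager", "sdpa", "flash_attention_2").
    if config._attn_implementation == "eager":
        return eager_attention_forward
    # Every registered kernel (sdpa, flash-attention, ...) is looked up by name.
    return ALL_ATTENTION_FUNCTIONS[config._attn_implementation]
```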
Fix invalid operand type (aurelien.lac, 233 days ago)
Allow image_sizes to be optional in forward pass to fit tests (aurelien.lac, 233 days ago)
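Pixtral's vision tower uses per-image (height, width) sizes, and making the argument optional lets generic test harnesses call forward without supplying it. A sketch of what such a fallback can look like; the recovery logic here is an assumption (it takes each image's full spatial extent from a padded 4-D batch), not quoted from the diff:

```python
# Sketch: derive image_sizes from pixel_values when the caller omits it.
from typing import Optional

import torch


def forward(pixel_values: torch.Tensor, image_sizes: Optional[torch.Tensor] = None):
    if image_sizes is None:
        # Assumed fallback: treat each image's full (height, width) as its size.
        batch_size, _, height, width = pixel_values.shape
        image_sizes = torch.tensor([[height, width]] * batch_size)
    # ... rest of the vision forward pass consumes image_sizes as before ...
    return image_sizes
```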
Disallow using sdpa with output_attentions (aurelien.lac, 233 days ago)
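`torch.nn.functional.scaled_dot_product_attention` fuses the softmax into the kernel and never materializes the attention matrix, so it cannot honor `output_attentions=True`. Whether the PR warns-and-falls-back or rejects the combination outright, the guard has the same shape; this sketch takes the warn-and-fall-back form common in transformers, with the warning text assumed:

```python
# Sketch of the sdpa/output_attentions guard; wording and exact behavior assumed.
from transformers.utils import logging

logger = logging.get_logger(__name__)


def resolve_backend(attn_implementation: str, output_attentions: bool) -> str:
    if attn_implementation == "sdpa" and output_attentions:
        # SDPA cannot return attention weights, so use the eager path instead.
        logger.warning_once(
            "`sdpa` does not support `output_attentions=True`; "
            "falling back to eager attention."
        )
        return "eager"
    return attn_implementation
```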
Delete useless comments, use eager attention from smolvlm, use pattern from mistral (aurelien.lac, 233 days ago)
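The eager path referenced here is the `eager_attention_forward` shape shared by models like SmolVLM and Mistral: matmul, additive mask, softmax in float32, dropout, matmul. A sketch close to that shared helper, though not guaranteed identical to the PR's copy:

```python
# Sketch of the shared eager attention function used as the "eager" backend.
from typing import Optional

import torch
from torch import nn


def eager_attention_forward(
    module: nn.Module,
    query: torch.Tensor,                    # (batch, heads, q_len, head_dim)
    key: torch.Tensor,                      # (batch, heads, kv_len, head_dim)
    value: torch.Tensor,                    # (batch, heads, kv_len, head_dim)
    attention_mask: Optional[torch.Tensor],
    scaling: float,
    dropout: float = 0.0,
    **kwargs,
):
    attn_weights = torch.matmul(query, key.transpose(-1, -2)) * scaling
    if attention_mask is not None:
        # Additive mask: large negative values at disallowed positions.
        attn_weights = attn_weights + attention_mask
    attn_weights = nn.functional.softmax(attn_weights, dim=-1, dtype=torch.float32).to(query.dtype)
    attn_weights = nn.functional.dropout(attn_weights, p=dropout, training=module.training)
    attn_output = torch.matmul(attn_weights, value)
    # Hand back (batch, q_len, heads, head_dim) for the output projection.
    attn_output = attn_output.transpose(1, 2).contiguous()
    return attn_output, attn_weights
```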
add _supports_attention_backend (aurelien.lac, 232 days ago)
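`_supports_attention_backend` is a class attribute on the model's `PreTrainedModel` subclass that tells transformers the model speaks the pluggable attention-interface protocol. A sketch of the opt-in; the class body is elided and the neighboring flags are assumed rather than copied from the diff:

```python
# Sketch: opting the model into the pluggable attention backends.
from transformers import PreTrainedModel


class PixtralPreTrainedModel(PreTrainedModel):
    _supports_sdpa = True            # assumed pre-existing flags
    _supports_flash_attn_2 = True
    # New in this PR: advertise compatibility with ALL_ATTENTION_FUNCTIONS
    # so from_pretrained accepts any registered attn_implementation.
    _supports_attention_backend = True
```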
use kwargs instead of position_ids (aurelien.lac, 232 days ago)
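The last commit switches the layer signatures from an explicit `position_ids` argument to `**kwargs`, so backend-specific extras (position embeddings, flash-attention lengths, and the like) flow through to the attention interface untouched. A minimal sketch of the convention; the `PixtralAttentionLayer` shown is hypothetical and far smaller than the real layer:

```python
# Sketch of the **kwargs calling convention.
import torch
from torch import nn


class PixtralAttentionLayer(nn.Module):
    def __init__(self, attention: nn.Module):
        super().__init__()
        self.attention = attention

    def forward(self, hidden_states: torch.Tensor, **kwargs) -> torch.Tensor:
        # No explicit position_ids parameter anymore: whatever the caller
        # provides rides through kwargs to the selected attention backend.
        return self.attention(hidden_states, **kwargs)
```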