transformers
f6664ee7 - Add ALL_ATTENTION_FUNCTIONS compatibility for Pixtral model (#37960)

Add ALL_ATTENTION_FUNCTIONS compatibility for Pixtral model (#37960)

* Add ALL_ATTENTION_FUNCTIONS compatibility for Pixtral model
* Fix invalid operand type
* Allow image_sizes to be optional in forward pass to fit tests

  Disallow using sdpa and output_attentions
* Disallow using sdpa with output_attentions
* Delete useless comments, use eager attention from smolvlm, use pattern from mistral
* add _supports_attention_backend
* use kwargs instead of position_ids

---------

Co-authored-by: aurelien.lac <aurelien.lac@lighton.ai>
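For context, a minimal sketch of the attention-dispatch pattern this commit describes (the Mistral-style layout: a module-level eager_attention_forward, a lookup into ALL_ATTENTION_FUNCTIONS for non-eager backends, sdpa disallowed when attention weights are requested, and extra arguments passed via **kwargs rather than position_ids). This is not the actual Pixtral code; names such as `SketchConfig` and `SketchAttention` are illustrative, and the real model may warn and fall back instead of raising.

```python
# Sketch of the ALL_ATTENTION_FUNCTIONS dispatch pattern (assumptions noted above).
from dataclasses import dataclass
from typing import Callable, Optional

import torch
from torch import nn

from transformers.modeling_utils import ALL_ATTENTION_FUNCTIONS


def eager_attention_forward(module, query, key, value, attention_mask, scaling, dropout=0.0, **kwargs):
    """Plain PyTorch attention; the only backend that can return attention weights."""
    attn_weights = torch.matmul(query, key.transpose(2, 3)) * scaling
    if attention_mask is not None:
        attn_weights = attn_weights + attention_mask
    attn_weights = nn.functional.softmax(attn_weights, dim=-1, dtype=torch.float32).to(query.dtype)
    attn_weights = nn.functional.dropout(attn_weights, p=dropout, training=module.training)
    attn_output = torch.matmul(attn_weights, value).transpose(1, 2).contiguous()
    return attn_output, attn_weights


@dataclass
class SketchConfig:
    hidden_size: int = 64
    num_attention_heads: int = 4
    _attn_implementation: str = "eager"  # e.g. "eager", "sdpa", "flash_attention_2"


class SketchAttention(nn.Module):
    def __init__(self, config: SketchConfig):
        super().__init__()
        self.config = config
        self.num_heads = config.num_attention_heads
        self.head_dim = config.hidden_size // config.num_attention_heads
        self.scaling = self.head_dim ** -0.5
        self.q_proj = nn.Linear(config.hidden_size, config.hidden_size)
        self.k_proj = nn.Linear(config.hidden_size, config.hidden_size)
        self.v_proj = nn.Linear(config.hidden_size, config.hidden_size)
        self.o_proj = nn.Linear(config.hidden_size, config.hidden_size)

    def forward(
        self,
        hidden_states: torch.Tensor,
        attention_mask: Optional[torch.Tensor] = None,
        output_attentions: bool = False,
        **kwargs,  # backend-specific extras travel through kwargs instead of position_ids
    ):
        batch, seq_len, _ = hidden_states.shape
        shape = (batch, seq_len, self.num_heads, self.head_dim)
        query = self.q_proj(hidden_states).view(shape).transpose(1, 2)
        key = self.k_proj(hidden_states).view(shape).transpose(1, 2)
        value = self.v_proj(hidden_states).view(shape).transpose(1, 2)

        # Default to eager; otherwise look the configured backend up in
        # ALL_ATTENTION_FUNCTIONS. sdpa cannot return attention weights, so it is
        # disallowed together with output_attentions.
        attention_interface: Callable = eager_attention_forward
        if self.config._attn_implementation != "eager":
            if self.config._attn_implementation == "sdpa" and output_attentions:
                raise ValueError("sdpa cannot return attention weights; use eager instead.")
            attention_interface = ALL_ATTENTION_FUNCTIONS[self.config._attn_implementation]

        attn_output, attn_weights = attention_interface(
            self, query, key, value, attention_mask,
            scaling=self.scaling, dropout=0.0, **kwargs,
        )
        attn_output = self.o_proj(attn_output.reshape(batch, seq_len, -1))
        return attn_output, (attn_weights if output_attentions else None)
```

Routing extra arguments through **kwargs is what lets each registered backend receive the inputs it needs without every model's forward signature having to enumerate them; declaring `_supports_attention_backend` on the pretrained-model class then advertises that the model can use this dispatch mechanism.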