transformers
Fix: gemma4 has flash-attention-incompatible head-dim=512
#45202
Open

Qubitium wants to merge 9 commits into huggingface:main from Qubitium:gemma4-fa-fix
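The incompatibility this PR addresses can be illustrated with a small guard. This is only a sketch, not the PR's actual code: FlashAttention-2 kernels support head dimensions only up to 256, so a model configured with head-dim=512 (as the title says gemma4 is) cannot use that backend and must fall back to another implementation. The helper `pick_attn_implementation` below is hypothetical, not a transformers API.

```python
# Sketch of the head-dim constraint behind this PR (hypothetical helper,
# not transformers API). FlashAttention-2 kernels support head dims up
# to 256, so head_dim=512 must fall back to a different backend.

FLASH_ATTN_MAX_HEAD_DIM = 256  # FA2 kernel limit

def pick_attn_implementation(head_dim: int, requested: str = "flash_attention_2") -> str:
    """Return a usable attention implementation for the given head dim."""
    if requested == "flash_attention_2" and head_dim > FLASH_ATTN_MAX_HEAD_DIM:
        # head_dim=512 exceeds the FA2 limit: fall back to SDPA
        return "sdpa"
    return requested

print(pick_attn_implementation(512))  # head_dim=512: falls back
print(pick_attn_implementation(128))  # within the FA2 limit
```

The same check could equally return `"eager"`; the point is only that the requested flash-attention backend is rejected when `head_dim` exceeds what its kernels support.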
Qubitium: fix gemma4 has flash-attention incompatbile head-dim=512 (cdf601b7)
Qubitium: Merge branch 'main' into gemma4-fa-fix (128f8e4e)
Qubitium: remove head-dim override (3919a91c)
Qubitium commented on 2026-04-02
Qubitium: fix gemma4 tests (f7f9ea00)
Qubitium: cleanup (5bc8fdb4)
Qubitium: fix ci failing (66d2c0cb)
Qubitium commented on 2026-04-03
Qubitium commented on 2026-04-03
Qubitium: Merge branch 'main' into gemma4-fa-fix (94793562)
Qubitium changed the title from "fix gemma4 has flash-attention incompatbile head-dim=512" to "Fix gemma4 has flash-attention incompatbile head-dim=512" (6 days ago)
Cyrilvallez commented on 2026-04-10
Qubitium: revert test changes (2903f030)
Qubitium: Merge remote-tracking branch 'origin/main' into gemma4-fa-fix (e18fe4ab)