transformers
[VLMs] fix flash-attention tests
#37603
Merged

zucchini-nlp merged 7 commits into huggingface:main from zucchini-nlp:janus
zucchini-nlp: fix one test (8ef6a2b8)
github-actions marked this pull request as draft 242 days ago
zucchini-nlp marked this pull request as ready for review 242 days ago
zucchini-nlp: Merge branch 'main' into janus (53cb846f)
zucchini-nlp: fa2 ln test (de9e02ad)
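For context on what a flash-attention test of this kind exercises, here is a minimal sketch of an eager vs. flash_attention_2 equivalence check. The checkpoint id is a placeholder, it requires a CUDA GPU with the flash-attn package installed, and it is not the PR's actual test code.

```python
# Sketch only: compare eager and flash-attention-2 outputs for the same checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/tiny-model"  # placeholder checkpoint id

tokenizer = AutoTokenizer.from_pretrained(model_id)
inputs = tokenizer("Hello, flash attention!", return_tensors="pt").to("cuda")

# Load the same weights twice, once per attention implementation.
eager = AutoModelForCausalLM.from_pretrained(
    model_id, attn_implementation="eager", torch_dtype=torch.float16
).to("cuda")
fa2 = AutoModelForCausalLM.from_pretrained(
    model_id, attn_implementation="flash_attention_2", torch_dtype=torch.float16
).to("cuda")

with torch.no_grad():
    logits_eager = eager(**inputs).logits
    logits_fa2 = fa2(**inputs).logits

# Half-precision kernels differ slightly, so compare with a loose tolerance.
assert torch.allclose(logits_eager, logits_fa2, atol=4e-2, rtol=4e-2)
```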
zucchini-nlp changed the title from "[janus] fix tests" to "[VLMs] fix flash-attention tests" 242 days ago
zucchini-nlp requested a review from SunMarc 242 days ago
SunMarc approved these changes on 2025-04-18
zucchini-nlp: remove keys from config recursively (62518f9c)
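The commit message above describes stripping keys at every level of a nested config. The helper name, the `_attn_implementation` key, and the composite-config layout below are illustrative assumptions based only on that message, not the PR's actual code.

```python
def remove_keys_recursively(config: dict, keys_to_remove: set) -> dict:
    """Drop the given keys at every nesting level of a config dict."""
    cleaned = {}
    for key, value in config.items():
        if key in keys_to_remove:
            continue
        if isinstance(value, dict):
            # Recurse into sub-configs (e.g. text_config, vision_config).
            value = remove_keys_recursively(value, keys_to_remove)
        cleaned[key] = value
    return cleaned

# Hypothetical composite VLM config with per-component attention settings.
config = {
    "model_type": "janus",
    "_attn_implementation": "flash_attention_2",
    "text_config": {"model_type": "llama", "_attn_implementation": "flash_attention_2"},
    "vision_config": {"model_type": "siglip", "_attn_implementation": "flash_attention_2"},
}
print(remove_keys_recursively(config, {"_attn_implementation"}))
```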
zucchini-nlp: Merge branch 'main' into janus (c4c9b968)
zucchini-nlp: fix (a19c4a93)
zucchini-nlp: fixup (0f0c1cdd)
zucchini-nlp merged commit 1cfcbfca into main 236 days ago
SunMarc commented on 2025-04-25
