add qwen2.5vl #35569

ShuaiBai623 · 177 days ago · 🎉 16

What does this PR do?

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

ShuaiBai623 add qwen2.5vl
e693210a
ShuaiBai623 requested a review from ArthurZucker 177 days ago
ShuaiBai623 requested a review from qubvel 177 days ago
ShuaiBai623 requested a review from molbap 177 days ago
ShuaiBai623 requested a review from yonigozlan 177 days ago
ShuaiBai623 requested a review from stevhliu 177 days ago
ShuaiBai623 requested a review from Rocketknight1 177 days ago
ShuaiBai623 fix
8a713a90
ShuaiBai623 pass check table
be1f811c
minostauros requested changes on 2025-01-09 (177 days ago)

Brought changes from #35466
Thanks for the new model!

src/transformers/models/qwen2_5_vl/modeling_qwen2_5_vl.py (around lines 1502–1505)

    def get_rope_index(
        self,
        input_ids: torch.LongTensor,

minostauros · 177 days ago (conversation resolved)

Suggested change:
-        input_ids: torch.LongTensor,
+        input_ids: Optional[torch.LongTensor] = None,
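The suggestion appears motivated by call paths that provide only inputs_embeds, in which case input_ids is None by the time rope indices are computed. A minimal, hypothetical sketch of that defensive pattern (a standalone function with a trimmed parameter list, not the merged method):

    from typing import Optional

    import torch


    def get_rope_index_sketch(
        input_ids: Optional[torch.LongTensor] = None,  # may be None when only inputs_embeds are passed
        attention_mask: Optional[torch.Tensor] = None,
    ) -> torch.Tensor:
        # Without token ids there is nothing to locate image/video spans in, so fall back
        # to plain sequential positions broadcast over the three rope axes.
        if input_ids is None:
            seq_len = attention_mask.shape[-1] if attention_mask is not None else 0
            return torch.arange(seq_len).view(1, 1, -1).expand(3, 1, -1)
        batch_size, seq_len = input_ids.shape
        return torch.arange(seq_len).view(1, 1, -1).expand(3, batch_size, -1)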
src/transformers/models/qwen2_5_vl/modeling_qwen2_5_vl.py (around lines 1661–1664)

    if attention_mask is not None:
        position_ids = attention_mask.long().cumsum(-1) - 1
        position_ids.masked_fill_(attention_mask == 0, 1)
        position_ids = position_ids.unsqueeze(0).expand(3, -1, -1).to(input_ids.device)

minostauros · 177 days ago (conversation resolved)

Suggested change:
-        position_ids = position_ids.unsqueeze(0).expand(3, -1, -1).to(input_ids.device)
+        position_ids = position_ids.unsqueeze(0).expand(3, -1, -1).to(attention_mask.device)
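The device in the final .to(...) matters because this branch can run when input_ids is None (embeddings-only calls), so attention_mask is the only tensor here guaranteed to exist. A small self-contained illustration of the pattern under that assumption (toy tensors, not the merged code):

    import torch

    # Toy mask: 1 = real token, 0 = padding.
    attention_mask = torch.tensor([[1, 1, 1, 1, 0, 0]], dtype=torch.long)

    # Derive per-token positions from the mask, ignoring padded slots.
    position_ids = attention_mask.long().cumsum(-1) - 1
    position_ids.masked_fill_(attention_mask == 0, 1)

    # Broadcast to the three rope axes; using attention_mask.device is safe even when
    # input_ids was never provided.
    position_ids = position_ids.unsqueeze(0).expand(3, -1, -1).to(attention_mask.device)
    print(position_ids.shape)  # torch.Size([3, 1, 6])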
src/transformers/models/qwen2_5_vl/modeling_qwen2_5_vl.py (around lines 1788–1791)

    attention_mask = attention_mask.to(inputs_embeds.device)

    # if we get 4D attention mask we cannot calculate rope deltas anymore. TODO @raushan fixme
    if position_ids is None and input_ids is not None and (attention_mask is None or attention_mask.ndim == 2):

minostauros · 177 days ago (conversation resolved)

Suggested change:
-    if position_ids is None and input_ids is not None and (attention_mask is None or attention_mask.ndim == 2):
+    if position_ids is None and (attention_mask is None or attention_mask.ndim == 2):
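The net effect of the suggestion is that the gate no longer requires input_ids at all; only the mask's shape matters, since a pre-expanded 4D mask no longer identifies padding tokens. A condensed, hypothetical restatement of that condition:

    from typing import Optional

    import torch


    def can_compute_rope_deltas(
        position_ids: Optional[torch.Tensor],
        attention_mask: Optional[torch.Tensor],
    ) -> bool:
        # Recompute rope deltas only when positions were not supplied and the attention
        # mask is absent or still 2D; a 4D mask has already been expanded for attention
        # and no longer says which tokens are padding.
        return position_ids is None and (attention_mask is None or attention_mask.ndim == 2)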
qubvel added the New model label
qubvel added the Multimodal label
ShuaiBai623 add modular file
a666eb02
ShuaiBai623 fix style
a96b32d6
ShuaiBai623 Update src/transformers/models/qwen2_5_vl/modeling_qwen2_5_vl.py
8191ce9b
ShuaiBai623 Update src/transformers/models/qwen2_5_vl/modeling_qwen2_5_vl.py
2b70136f
ShuaiBai623 Update src/transformers/models/qwen2_5_vl/modeling_qwen2_5_vl.py
c23055b7
ShuaiBai623 padd copy check
b1e2f688
ShuaiBai623 use modular
9d0e553e
ShuaiBai623 requested a review from zucchini-nlp 169 days ago
ShuaiBai623 fix
23e6635c
ShuaiBai623 fix
aa0dc95c
ShuaiBai623 fix
05a1ab15
ShuaiBai623 update flashatt2&sdpa support_list
00b05f2d
stevhliu commented on 2025-01-17 (168 days ago)

Thanks :)

ShuaiBai623 Update docs/source/en/_toctree.yml
47eb0694
ShuaiBai623 Update docs/source/en/model_doc/qwen2_5_vl.md
35b9dbb2
ShuaiBai623 Update docs/source/en/model_doc/qwen2_5_vl.md
67a10dae
ShuaiBai623 Update docs/source/en/model_doc/qwen2_5_vl.md
e37ce587
ShuaiBai623 Update docs/source/en/model_doc/qwen2_5_vl.md
fcbb355a
ShuaiBai623 Update src/transformers/models/qwen2_5_vl/modular_qwen2_5_vl.py
7f3c66e7
ShuaiBai623 update config
f5a4c2b5
ShuaiBai623 Merge branch 'main' into add_qwen2.5vl
00f9832f
ShuaiBai623 update
f57827dd
ShuaiBai623 fix hf path
b21bd62d
gewenbin0992 rename Qwen2_5_VLVideosKwargs
18ea4bef
gewenbin0992 force pushed to 18ea4bef 166 days ago
gewenbin0992 fix
faae9f48
gewenbin0992 fix
cd3fd21c
ArthurZucker commented on 2025-01-20 (166 days ago)

Super super nice! 🤗
Great use of modular, thanks for the various integration tests as well!

gewenbin0992 update
f8c465cb
gewenbin0992 excuted modular
84247a47
gewenbin0992 rollback init
0cd1f621
gewenbin0992 fix
446383fe
gewenbin0992 formated
f87160db
gewenbin0992 simpler init
4012f4aa
gewenbin0992 fix
d671be8c
gewenbin0992 fix
ad46b878
gewenbin0992 fix
8390d99d
gewenbin0992 fix
16913ab0
gewenbin0992 fix
72de3020
gewenbin0992 update docs
91a00e32
minostauros commented on 2025-01-22 (164 days ago)

Again, brought changes from #35466

gewenbin0992 fix
ca2e16cd
gewenbin0992 fix
64db9c7d
gewenbin0992 Merge branch 'huggingface:main' into add_qwen2.5vl
bb0c4596
gewenbin0992 Merge branch 'main' into add_qwen2.5vl
df50ed56
gewenbin0992 update Qwen2VLRotaryEmbedding for yarn
c84216d7
ArthurZucker approved these changes on 2025-01-23 (163 days ago) · 😄 1

Thanks! Only a few comments that are not yet addressed; we like to split long sequences of code across more lines for debugging purposes and readability!

Thanks for your PR 🤗
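As an aside, the "split long code sequences" preference is purely stylistic; a made-up example, unrelated to this PR, of what it means in practice:

    import torch

    x = torch.randn(2, 3, 4)

    # Harder to debug: one long chain, no intermediate values to inspect.
    y_chained = x.permute(0, 2, 1).reshape(2, -1).float().softmax(dim=-1)

    # Easier to debug: each step named and inspectable on its own line.
    x_transposed = x.permute(0, 2, 1)
    x_flat = x_transposed.reshape(2, -1)
    x_float = x_flat.float()
    y_split = x_float.softmax(dim=-1)

    assert torch.allclose(y_chained, y_split)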

gewenbin0992 Merge branch 'main' into add_qwen2.5vl
87882dbe
gewenbin0992 fix
a0b3a945
ArthurZucker merged f3f6c865 into main 163 days ago
