transformers

[`ProcessingIdefics`] Attention mask bug with padding #29449 (Merged)
- byi8220 Defaulted IdeficsProcessor padding to 'longest', removed manual padding (43119af0)
- byi8220 make fixup (83266818)
amyeroberts commented on 2024-03-08
- byi8220 Defaulted processor call to padding=False (67287709)
- byi8220 Add padding to processor call in IdeficsModelIntegrationTest as well (bebd7c18)
- byi8220 Defaulted IdeficsProcessor padding to 'longest', removed manual padding (8d8cc2f8)
- byi8220 make fixup (26571b22)
- byi8220 Defaulted processor call to padding=False (72d9e00d)
- byi8220 Add padding to processor call in IdeficsModelIntegrationTest as well (86752113)
- byi8220 Merge branch 'huggingface:main' into attention-mask-bug-with-padding (9014e8b4)
- byi8220 Merge branch 'huggingface:main' into attention-mask-bug-with-padding (1be6fffe)
- byi8220 Merge branch 'attention-mask-bug-with-padding' of https://github.com/… (ba647eaf)
ArthurZucker changed the title from "Attention mask bug with padding" to "[`ProcessingIdefics`] Attention mask bug with padding" (1 year ago)
- byi8220 redefaulted padding=longest again (5e27c2c5)
- byi8220 fixup/doc (a597d6dc)
- byi8220 Merge branch 'huggingface:main' into attention-mask-bug-with-padding (8a627227)
- byi8220 Merge branch 'huggingface:main' into attention-mask-bug-with-padding (ba2f2da3)
- byi8220 Merge branch 'huggingface:main' into attention-mask-bug-with-padding (f79ea7f6)
amyeroberts approved these changes on 2024-04-04
amyeroberts merged 75b76a5e into main (1 year ago)
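The commits above default the processor's `padding` argument to `'longest'` so that the attention mask lines up with the padded token ids. As a rough illustration of why that matters, here is a minimal, hypothetical sketch of batch padding in plain Python (`pad_batch` is an invented helper; it is not the actual `IdeficsProcessor` implementation):

```python
# Hypothetical sketch of how a `padding` option affects the attention mask.
# This is illustration only, not the transformers IdeficsProcessor code.

def pad_batch(token_batches, padding="longest", pad_id=0):
    """Pad a batch of token-id lists and build matching attention masks.

    padding="longest": pad every sequence to the longest one in the batch.
    padding=False:     leave sequences ragged; masks are all ones.
    """
    if padding == "longest":
        max_len = max(len(seq) for seq in token_batches)
        input_ids = [seq + [pad_id] * (max_len - len(seq)) for seq in token_batches]
        # Padded positions get mask 0 so attention ignores them.
        attention_mask = [
            [1] * len(seq) + [0] * (max_len - len(seq)) for seq in token_batches
        ]
    else:  # padding=False
        input_ids = [list(seq) for seq in token_batches]
        attention_mask = [[1] * len(seq) for seq in token_batches]
    return {"input_ids": input_ids, "attention_mask": attention_mask}


batch = pad_batch([[5, 6, 7], [8, 9]], padding="longest")
# The shorter sequence is padded and its extra position is masked out:
# input_ids      -> [[5, 6, 7], [8, 9, 0]]
# attention_mask -> [[1, 1, 1], [1, 1, 0]]
```

The bug class this PR addresses is the mismatch that arises when sequences are padded manually but the attention mask is not updated to zero out the padded positions; defaulting to `'longest'` keeps the two in sync inside the processor.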
