Add warning for missing attention mask when pad tokens are detected to various models #25345
Add attention mask and pad token warning to many of the models (cb9afe06)
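The check this PR adds to each model can be sketched as a standalone helper. This is a hypothetical, torch-free simplification (function name and signature are assumptions, not the library's actual API): if the input ids contain the pad token but no attention mask was passed, the padded positions would be attended to, so the model emits a warning.

```python
import warnings

def warn_if_padding_and_no_attention_mask(input_ids, attention_mask, pad_token_id):
    """Hypothetical sketch of the pad-token warning added in this PR.

    Returns True if a warning was emitted, False otherwise.
    """
    # Nothing to check if a mask was supplied or no pad token is configured.
    if attention_mask is not None or pad_token_id is None:
        return False
    # Pad tokens present without a mask: the model would attend to padding.
    if pad_token_id in input_ids:
        warnings.warn(
            "We strongly recommend passing an `attention_mask` since your "
            "input_ids may contain padding tokens, which can lead to "
            "unexpected behaviour."
        )
        return True
    return False
```

In the actual models the equivalent check runs once per forward pass, before the input shape is derived, so the warning fires regardless of which code path the inputs take.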
hackyon marked this pull request as ready for review 2 years ago
Remove changes under examples/research_projects (f57d8488)
hackyon force pushed to f57d8488 2 years ago
Skip the warning check during torch.fx or JIT tracing (ec9c3bb3)
Switch ordering for the warning and input shape assignment (1916d8e9)
Add missing line break in one of the files (c40467fd)
ydshieh approved these changes on 2023-08-08
sgugger approved these changes on 2023-08-08
ydshieh merged 5ea2595e into main 2 years ago
hackyon deleted the warning-attention-mask-more-models branch 2 years ago