transformers
Add warning for missing attention mask when pad tokens are detected to various models
#25345
Merged


hackyon Add attention mask and pad token warning to many of the models (cb9afe06)
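The commit above wires a padding/attention-mask check into the forward pass of many model classes. Below is a minimal, self-contained sketch of that kind of check (the helper name, message wording, and logging setup are illustrative, not the library's exact implementation): if no `attention_mask` is passed and the configured pad token actually occurs in `input_ids`, a warning is logged, since the model cannot tell padding from real tokens without a mask.

```python
import logging

import torch

logger = logging.getLogger(__name__)


def warn_if_padding_and_no_attention_mask(input_ids, attention_mask, pad_token_id):
    # Nothing to check if a mask was provided or no pad token is configured.
    if attention_mask is not None or pad_token_id is None:
        return
    # Warn only when the pad token actually appears in the batch.
    if (input_ids == pad_token_id).any():
        logger.warning(
            "Pad tokens were detected in `input_ids` but no `attention_mask` was passed. "
            "Pass an attention mask so padding positions can be ignored."
        )


# Example: a batch padded with token id 0 but no mask -> triggers the warning.
warn_if_padding_and_no_attention_mask(torch.tensor([[5, 6, 0, 0]]), None, pad_token_id=0)
```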
hackyon marked this pull request as ready for review 2 years ago
hackyon Remove changes under examples/research_projects (f57d8488)
hackyon force-pushed to f57d8488 2 years ago
hackyon Skip the warning check during torch.fx or JIT tracing (ec9c3bb3)
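This commit addresses tracing: the check inspects tensor values (whether pad tokens are present), which cannot run on the symbolic proxies used by torch.fx or during TorchScript/JIT tracing, so the warning has to be skipped in those modes. A hedged sketch of such a guard, using a hypothetical helper name `_should_skip_padding_warning`; the library's exact conditions may differ:

```python
import torch
import torch.fx


def _should_skip_padding_warning(input_ids) -> bool:
    # Value-dependent branching cannot run on symbolic inputs, so skip the
    # padding warning when the model is being traced instead of run eagerly.
    if torch.jit.is_tracing():
        return True
    if isinstance(input_ids, torch.fx.Proxy):
        return True
    return False
```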
hackyon Switch ordering for the warning and input shape assignment (1916d8e9)
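The ordering commit concerns where the check sits inside each model's `forward`. A toy sketch of the resulting call-site pattern (not a verbatim excerpt from any file touched by the PR): the warning runs on the raw `input_ids` in the same branch where the input shape is derived from them.

```python
import torch
from torch import nn


class TinyModel(nn.Module):
    """Toy model showing the call-site pattern; the method below is the sketch
    from above, not the library's own helper."""

    def warn_if_padding_and_no_attention_mask(self, input_ids, attention_mask):
        pass  # placeholder for the check sketched earlier

    def forward(self, input_ids=None, attention_mask=None, inputs_embeds=None):
        if input_ids is not None and inputs_embeds is not None:
            raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time")
        elif input_ids is not None:
            # Warning check first, then the input shape is read from input_ids.
            self.warn_if_padding_and_no_attention_mask(input_ids, attention_mask)
            input_shape = input_ids.size()
        elif inputs_embeds is not None:
            input_shape = inputs_embeds.size()[:-1]
        else:
            raise ValueError("You have to specify either input_ids or inputs_embeds")
        return input_shape
```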
hackyon Add missing line break in one of the files (c40467fd)
ydshieh approved these changes on 2023-08-08
ydshieh requested a review from sgugger 2 years ago
sgugger approved these changes on 2023-08-08
ydshieh merged 5ea2595e into main 2 years ago
hackyon deleted the warning-attention-mask-more-models branch 2 years ago
