transformers
55676d7d
- Fix warning for output_attentions=True (#40597)
Commit
213 days ago
Fix warning for output_attentions=True (#40597)

* Fix attn_implementation for output_attentions
* remove setting attention, just raise warning
* improve message
* Update src/transformers/utils/generic.py
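The change described above replaces silently overriding the attention implementation with a warning when `output_attentions=True` is requested under a backend that cannot return attention weights. A minimal sketch of that pattern (the function name and warning message here are hypothetical illustrations, not the actual code in src/transformers/utils/generic.py):

```python
import warnings


def check_output_attentions(output_attentions: bool, attn_implementation: str) -> None:
    """Warn, rather than override the backend, when attention weights
    cannot be returned (hypothetical helper illustrating the commit's approach)."""
    # Fused backends such as "sdpa" or "flash_attention_2" do not materialize
    # attention weights, so output_attentions=True cannot be honored there.
    if output_attentions and attn_implementation != "eager":
        warnings.warn(
            f"`output_attentions=True` is not supported with "
            f"`attn_implementation={attn_implementation!r}`; attention weights "
            "will not be returned. Use `attn_implementation='eager'` to get them."
        )


# Emits a UserWarning instead of silently switching the implementation:
check_output_attentions(output_attentions=True, attn_implementation="sdpa")
```

The point of warning instead of reassigning the implementation is that the user's configured backend stays in effect; they are merely told that the requested attentions will be missing.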
References
#40597 - Fix warning for output_attentions=True
Author
qubvel
Parents
b67608f5