transformers
55676d7d - Fix warning for output_attentions=True (#40597)

Committed 213 days ago
Fix warning for output_attentions=True (#40597)

* Fix attn_implementation for output_attentions
* remove setting attention, just raise warning
* improve message
* Update src/transformers/utils/generic.py
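For context, the behavior this commit touches: fused attention kernels (SDPA, FlashAttention) do not materialize per-head attention weights, so requesting `output_attentions=True` is only fully supported with the eager implementation, and the library warns rather than silently switching. A minimal sketch of the working pattern is below; the tiny config sizes are illustrative assumptions, not values from the commit:

```python
import torch
from transformers import BertConfig, BertModel

# Tiny randomly initialized model (no weights downloaded); sizes are arbitrary.
config = BertConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
)
# Select the eager implementation so attention weights can be returned.
config._attn_implementation = "eager"

model = BertModel(config)
model.eval()

input_ids = torch.tensor([[1, 2, 3, 4]])
with torch.no_grad():
    outputs = model(input_ids, output_attentions=True)

# One attention tensor per layer, shaped (batch, heads, seq_len, seq_len).
print(len(outputs.attentions))
print(tuple(outputs.attentions[0].shape))
```

With a non-eager implementation and `output_attentions=True`, recent versions emit the warning this commit refines instead of quietly changing the attention backend.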