transformers
6a5472a8 - Force use_cache to be False in PyTorch (#15385)

* Set use_cache = False for PyTorch models when labels are passed
* Fix for BigBirdPegasusForConditionalGeneration
* Add a warning if users specify use_cache=True
* Use logger.warning instead of warnings.warn

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
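As a rough sketch of the pattern this commit describes (the class, method signature, and warning text below are illustrative assumptions, not the library's verbatim code): when labels are passed to a model's forward method, caching past key/values serves no purpose, so use_cache is forced to False and a warning is emitted via logger.warning if the caller explicitly requested caching.

```python
import logging

logger = logging.getLogger(__name__)


class SketchDecoderModel:
    # Hypothetical minimal model used only to illustrate the pattern;
    # not the actual transformers implementation.
    def __init__(self, use_cache_default=True):
        self.use_cache_default = use_cache_default

    def forward(self, input_ids=None, labels=None, use_cache=None, **kwargs):
        # Fall back to the configured default when the caller did not set use_cache.
        use_cache = use_cache if use_cache is not None else self.use_cache_default

        if labels is not None:
            if use_cache:
                # Warn through logger.warning (rather than warnings.warn), then override.
                logger.warning(
                    "`use_cache` is set to False because `labels` were passed; "
                    "caching past key/values is not needed when computing a loss."
                )
            use_cache = False

        # ... the real forward pass would run the decoder with `use_cache` here ...
        return {"use_cache": use_cache}
```

For example, calling `SketchDecoderModel().forward(input_ids=[1, 2, 3], labels=[1, 2, 3], use_cache=True)` would log the warning and return use_cache as False, mirroring the behavior the commit message lists for PyTorch models.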