transformers
Force use_cache to be False in PyTorch #15385
Merged


ydshieh changed the title from "Remove inputs["use_cache"] = False" to "Remove use_cache=False in TFxxxForForConditionalGeneration" 3 years ago
ydshieh changed the title from "Remove use_cache=False in TFxxxForForConditionalGeneration" to "Remove use_cache=False in TFxxxForConditionalGeneration" 3 years ago
LysandreJik requested a review from patrickvonplaten 3 years ago
LysandreJik requested a review from patil-suraj 3 years ago
ydshieh: use_cache = False for PT models if labels is passed (43c8452a)
ydshieh force-pushed to 43c8452a 3 years ago
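A minimal sketch of the change this commit describes: when `labels` is passed to a PyTorch `*ForConditionalGeneration` model, the forward pass is a training step, so the past key/value cache is never reused and `use_cache` can be forced to `False`. The helper name below is hypothetical, used only to illustrate the pattern; it is not the actual transformers diff.

```python
# Hypothetical standalone sketch of the pattern introduced by the commit above.
def effective_use_cache(use_cache, labels=None):
    """Disable the decoding cache whenever labels are supplied (a training forward pass)."""
    if labels is not None:
        # Past key/values are never reused when a loss is being computed,
        # so building the cache would only waste memory and compute.
        use_cache = False
    return use_cache


print(effective_use_cache(True))                 # True  -> generation path, cache kept
print(effective_use_cache(True, labels=[1, 2]))  # False -> training path, cache disabled
```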
ydshieh: Fix for BigBirdPegasusForConditionalGeneration (aee4e7aa)
ydshieh changed the title from "Remove use_cache=False in TFxxxForConditionalGeneration" to "Force use_cache to be False in PyTorch" 3 years ago
ydshieh: add warning if users specify use_cache=True (2c04ac41)
patrickvonplaten commented on 2022-02-07
ydshieh: Use logger.warning instead of warnings.warn (26f17e84)
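Combining the last two commits, the final behaviour warns the caller through the module logger (rather than `warnings.warn`) before overriding an explicit `use_cache=True`. The sketch below is illustrative only; the helper name and the exact warning text are assumptions, not the merged diff.

```python
# Hypothetical sketch of the final pattern, assuming the standard logging module.
import logging

logger = logging.getLogger(__name__)


def effective_use_cache(use_cache, labels=None):
    if labels is not None:
        if use_cache:
            # logger.warning (rather than warnings.warn) so the message is routed
            # through the library's logging configuration.
            logger.warning("`use_cache` is set to `False` because `labels` is provided.")
        use_cache = False
    return use_cache
```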
patil-suraj approved these changes on 2022-02-08
patrickvonplaten merged 6a5472a8 into master 3 years ago
ydshieh deleted the fix_missing_cache_in_tf_models branch 3 years ago
