transformers
dc540dd3 - Adding `handle_long_generation` parameters for `text-generation` pipeline. (#14118)

Adding `handle_long_generation` parameters for the `text-generation` pipeline. (#14118)

* Adding `handle_long_generation` parameters for the `text-generation` pipeline.
* More error handling.
* Fixing tests by dropping TF support for this functionality; it needs `max_new_tokens` to make it possible to understand the user's intent. Otherwise, `max_length` == `tokenizer.model_max_length` < `input_ids.shape[0]`.
* Fixing doc?
* Doc?
* Remove link from doc.
* Caught an issue on roberta.
* Damn doc.
* Non-BC proposal?
* Cleaning the fix?
* Finally using only a test override.
* Don't need to modify this.
* Bad print.
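For context, a minimal sketch of how the new parameter is meant to be used from the pipeline API. The model name and prompt below are placeholders, and the `"hole"` value (which left-truncates the prompt so that `max_new_tokens` freshly generated tokens still fit in the model's maximum context length) is assumed from the released pipeline behaviour rather than spelled out in this commit message.

```python
from transformers import pipeline

# Illustrative usage of handle_long_generation with a prompt that is
# deliberately longer than the model's context window (placeholder model/prompt).
generator = pipeline("text-generation", model="gpt2")

very_long_prompt = "word " * 5000  # far beyond gpt2's 1024-token context

output = generator(
    very_long_prompt,
    max_new_tokens=20,               # required so the pipeline knows how much room to reserve
    handle_long_generation="hole",   # left-truncate the prompt instead of failing
)
print(output[0]["generated_text"][-200:])
```

This also illustrates why the commit drops TF support for the feature: without `max_new_tokens`, the pipeline cannot tell how many tokens of the overlong prompt it is allowed to drop.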