transformers
e2a6445e - Tokenizer fast warnings (#2922)

Tokenizer fast warnings (#2922)

* Remove warning when pad_to_max_length is not set.

  Signed-off-by: Morgan Funtowicz <morgan@huggingface.co>

* Move RoBERTa warning to RoBERTa and not GPT2 base tokenizer.

  Signed-off-by: Morgan Funtowicz <morgan@huggingface.co>
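The gist of the first change can be sketched as gating a padding warning on whether padding was actually requested. This is a minimal, self-contained illustration of that pattern, not the actual transformers code: the `encode` function, its signature, and the warning text are all hypothetical stand-ins.

```python
import warnings

def encode(text, pad_to_max_length=False, max_length=None, pad_token=None):
    """Hypothetical sketch of the post-fix behavior: only warn about
    padding problems when pad_to_max_length was explicitly set."""
    tokens = text.split()  # stand-in for real tokenization
    if pad_to_max_length:
        if pad_token is None:
            # Warn only when padding is requested but no pad token exists.
            # Before a fix like this, such a warning could fire even when
            # pad_to_max_length was left unset.
            warnings.warn("Asked to pad, but the tokenizer has no pad token.")
        elif max_length is not None:
            # Right-pad up to max_length.
            tokens += [pad_token] * max(0, max_length - len(tokens))
    return tokens
```

With this gating, a plain `encode("hello world")` call is silent, while `encode("hello world", pad_to_max_length=True)` without a pad token still warns.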