transformers
a8683756 - fix `processing_utils.py`: avoid deepcopying tokenizer in `ProcessorMixin` to improve performance (#44894)

fix(processing_utils): avoid deepcopying tokenizer in `ProcessorMixin.to_dict` to improve performance

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
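The commit message describes excluding a heavy tokenizer object from the deepcopy performed during serialization. A minimal sketch of that general pattern is below; the class and attribute names here are hypothetical stand-ins, not the actual `ProcessorMixin` implementation from `transformers`:

```python
import copy


class HeavyTokenizer:
    """Hypothetical stand-in for a tokenizer with a large vocabulary,
    which would be expensive to deepcopy."""

    def __init__(self):
        self.vocab = {f"tok{i}": i for i in range(100_000)}


class ProcessorSketch:
    """Sketch of the optimization: serialize to a dict without
    deepcopying the tokenizer attribute."""

    def __init__(self, tokenizer, image_size=224):
        self.tokenizer = tokenizer
        self.image_size = image_size

    def to_dict(self):
        # Build a shallow view of the instance attributes, drop the
        # tokenizer, then deepcopy only the remaining lightweight state.
        attrs = {k: v for k, v in self.__dict__.items() if k != "tokenizer"}
        return copy.deepcopy(attrs)


proc = ProcessorSketch(HeavyTokenizer())
serialized = proc.to_dict()
# The tokenizer is omitted; lightweight config survives the copy.
assert "tokenizer" not in serialized
assert serialized["image_size"] == 224
```

The deepcopy still protects callers from mutating the processor's own state through the returned dict, while the cost of copying the 100k-entry vocabulary is avoided entirely.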