transformers PR #44894 (Merged): fix `processing_utils.py`: avoid deepcopying tokenizer in `ProcessorMixin` to improve performance
Commits
fix(processing_utils): avoid deepcopying tokenizer in ProcessorMixin.to_dict to improve performance
ydshieh committed 16 days ago
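The commit message describes skipping the deepcopy of the tokenizer when serializing a processor's state. The PR diff is not shown here, so the following is only a minimal sketch of the general technique, with hypothetical class and attribute names: exclude heavyweight sub-processor attributes from the state dict before deepcopying it, rather than deepcopying everything and discarding the copies afterward.

```python
import copy


class FakeTokenizer:
    """Stand-in for a heavyweight tokenizer whose deepcopy would be expensive."""

    def __init__(self):
        self.vocab = {f"tok{i}": i for i in range(5)}


class ProcessorSketch:
    """Hypothetical ProcessorMixin-like class (names are illustrative only)."""

    # Sub-processor attributes that are serialized separately, not via to_dict.
    attributes = ["tokenizer"]

    def __init__(self, tokenizer):
        self.tokenizer = tokenizer
        self.size = {"height": 224, "width": 224}

    def to_dict(self):
        # Slow pattern: copy.deepcopy(self.__dict__) would also deepcopy the
        # tokenizer, only for it to be dropped from the serialized output.
        # Faster pattern sketched here: filter out sub-processor attributes
        # first, then deepcopy only the remaining lightweight state.
        state = {k: v for k, v in self.__dict__.items() if k not in self.attributes}
        output = copy.deepcopy(state)
        output["processor_class"] = self.__class__.__name__
        return output


processor = ProcessorSketch(FakeTokenizer())
serialized = processor.to_dict()
```

The saved JSON is unchanged by this reordering; only the cost of building it drops, since the tokenizer's large internal structures are never copied.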