transformers
fix `processing_utils.py`: avoid deepcopying tokenizer in `ProcessorMixin` to improve performance #44894

Merged

ydshieh merged 1 commit into main from debug_processor
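The diff itself is not reproduced on this page, but the title points at a common performance pattern: deep-copying a processor's attribute dict also deep-copies the attached tokenizer (and its large vocab), only for that copy to be discarded during serialization. A minimal sketch of the pattern, with hypothetical names (`ProcessorSketch`, `HeavyTokenizer`) that are illustrative only and not the actual `ProcessorMixin` implementation:

```python
import copy


class HeavyTokenizer:
    """Stand-in for a tokenizer with a large vocabulary (hypothetical)."""

    def __init__(self):
        self.vocab = {f"tok{i}": i for i in range(100_000)}


class ProcessorSketch:
    """Illustrative only -- not the real ProcessorMixin."""

    def __init__(self, tokenizer):
        self.tokenizer = tokenizer
        self.chat_template = None  # example of a light attribute worth serializing

    def to_dict_deepcopy_first(self):
        # Costly pattern: deepcopy *all* attributes (tokenizer included),
        # then throw the tokenizer copy away.
        output = copy.deepcopy(self.__dict__)
        del output["tokenizer"]
        return output

    def to_dict_filter_first(self):
        # Cheaper pattern: exclude the tokenizer *before* copying, so its
        # large vocab dict is never traversed by deepcopy.
        attrs = {k: v for k, v in self.__dict__.items() if k != "tokenizer"}
        return copy.deepcopy(attrs)


proc = ProcessorSketch(HeavyTokenizer())
# Both orderings yield the same result; only the cost differs.
assert proc.to_dict_deepcopy_first() == proc.to_dict_filter_first()
```

Filtering before copying keeps the output identical while skipping the expensive traversal of the tokenizer's internal state.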
ydshieh changed the title from fix(processing_utils): avoid deepcopying tokenizer in ProcessorMixin.… to fix `processing_utils.py`: avoid deepcopying tokenizer in `ProcessorMixin` to improve performance 8 days ago
ydshieh added commit 0567834c: fix(processing_utils): avoid deepcopying tokenizer in ProcessorMixin.…
ydshieh force-pushed from 7ef565f4 to 0567834c 8 days ago
ydshieh requested a review from ArthurZucker 6 days ago
ydshieh requested a review from itazap 6 days ago
ArthurZucker approved these changes on 2026-03-23
tarekziade approved these changes on 2026-03-23
ydshieh merged a8683756 into main 6 days ago
ydshieh deleted the debug_processor branch 6 days ago
