transformers — PR #26761 (merged)
🚨🚨🚨 [`Quantization`] Store the original dtype in the config as a private attribute 🚨🚨🚨
Timeline:
- younesbelkada pushed commit 2797129d "First step"
- younesbelkada pushed commit b7e797f3 "fix"
- younesbelkada requested a review from LysandreJik (2 years ago)
- younesbelkada commented (2023-10-12)
- HuggingFaceDocBuilderDev commented
- ArthurZucker commented (2023-10-13)
- younesbelkada pushed commit 73d31090 "add adjustements for gptq"
- younesbelkada pushed commit 28d2b27a "change to `_pre_quantization_dtype`"
- younesbelkada pushed commit 316f7766 "Update src/transformers/modeling_utils.py"
- younesbelkada pushed commit 948de7ed "fix serialization"
- younesbelkada requested a review from ArthurZucker (2 years ago)
- ArthurZucker approved these changes (2023-10-13)
- younesbelkada pushed commit 06700c73 "Apply suggestions from code review"
- younesbelkada pushed commit 1ebe674d "fixup"
- younesbelkada changed the title (2 years ago)
- younesbelkada merged fd6a0ade into main (2 years ago)
- younesbelkada deleted the add-orig-dtype branch (2 years ago)
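As the title and the commit messages ("change to `_pre_quantization_dtype`", "fix serialization") indicate, the PR records the dtype a model was loaded in before quantization as a private attribute on the config, and keeps that private attribute out of the serialized config. A minimal sketch of the idea, assuming a simplified stand-in for the real config class — `SimpleConfig` and `quantize` are hypothetical names, and the dtype is shown as a plain string rather than a `torch.dtype`:

```python
class SimpleConfig:
    """Hypothetical stand-in for a transformers PretrainedConfig."""

    def __init__(self, torch_dtype="float32"):
        self.torch_dtype = torch_dtype

    def to_dict(self):
        # Mirrors the "fix serialization" commit's intent: attributes with a
        # leading underscore are private and excluded from serialization.
        return {k: v for k, v in self.__dict__.items() if not k.startswith("_")}


def quantize(config):
    """Hypothetical quantization step: before the weights are converted,
    remember the original dtype as a private attribute on the config."""
    config._pre_quantization_dtype = config.torch_dtype
    config.torch_dtype = "int8"  # illustrative placeholder for the quantized dtype
    return config


cfg = quantize(SimpleConfig(torch_dtype="float16"))
print(cfg._pre_quantization_dtype)                 # float16
print("_pre_quantization_dtype" in cfg.to_dict())  # False
```

Keeping the attribute private means downstream code (e.g. dequantization or casting logic) can still recover the original dtype at runtime, while saved config files remain unchanged.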
