transformers · commit 5f6c080b (1 year ago)
RT-DETR parameterized batchnorm freezing (#32631)
RT-DETR parameterized batchnorm freezing (#32631)

* fix: Parameterized norm freezing

  For the R18 model, the authors don't freeze norms in the backbone.

* Update src/transformers/models/rt_detr/configuration_rt_detr.py

Co-authored-by: Pavel Iakubovskii <qubvel@gmail.com>

---------

Co-authored-by: Pavel Iakubovskii <qubvel@gmail.com>
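For context, the change makes batch-norm freezing in the backbone conditional on a config flag rather than unconditional, since the smaller R18 variant is trained without frozen norms. Below is a minimal sketch of the underlying technique in plain PyTorch, not the exact transformers implementation; the `FrozenBatchNorm2d` class, the `maybe_freeze_batch_norms` helper, and its `freeze` argument are illustrative names standing in for whatever the library exposes.

```python
import torch
from torch import nn


class FrozenBatchNorm2d(nn.Module):
    """BatchNorm2d with fixed statistics and affine parameters.

    A common detection-backbone trick: stats and weights are registered
    as buffers, so they receive no gradients and are never updated.
    """

    def __init__(self, num_features: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        self.register_buffer("weight", torch.ones(num_features))
        self.register_buffer("bias", torch.zeros(num_features))
        self.register_buffer("running_mean", torch.zeros(num_features))
        self.register_buffer("running_var", torch.ones(num_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = (x - mean) / sqrt(var + eps) * weight + bias, folded into scale/shift
        scale = self.weight * (self.running_var + self.eps).rsqrt()
        shift = self.bias - self.running_mean * scale
        return x * scale[None, :, None, None] + shift[None, :, None, None]


def maybe_freeze_batch_norms(model: nn.Module, freeze: bool) -> nn.Module:
    """Recursively swap nn.BatchNorm2d for FrozenBatchNorm2d when `freeze` is set.

    `freeze` plays the role of the new config flag: when False (as for R18),
    the backbone keeps its trainable batch norms untouched.
    """
    if not freeze:
        return model
    for name, child in model.named_children():
        if isinstance(child, nn.BatchNorm2d):
            frozen = FrozenBatchNorm2d(child.num_features, child.eps)
            frozen.weight.copy_(child.weight.detach())
            frozen.bias.copy_(child.bias.detach())
            frozen.running_mean.copy_(child.running_mean)
            frozen.running_var.copy_(child.running_var)
            setattr(model, name, frozen)
        else:
            maybe_freeze_batch_norms(child, freeze)
    return model


# Hypothetical usage: the flag would normally come from the model config.
backbone = nn.Sequential(nn.Conv2d(3, 18, 3), nn.BatchNorm2d(18), nn.ReLU())
maybe_freeze_batch_norms(backbone, freeze=True)
```

In the library itself, a call site would read the freeze flag from the RT-DETR configuration and pass it down when the backbone is built, so checkpoints that were trained with trainable norms (such as R18) are loaded with the correct behavior.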