RT-DETR parameterized batchnorm freezing (#32631)
* fix: Parameterized norm freezing
For the R18 model, the original authors do not freeze the norm layers in the backbone, so batch-norm freezing is now controlled through the configuration instead of being hard-coded (see the sketch below).
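A minimal sketch of how such a configurable flag could be used; the flag name `freeze_backbone_batch_norms` and its default are assumptions inferred from the PR title and the touched config file, not confirmed API:

```python
# Sketch only: `freeze_backbone_batch_norms` is an assumed flag name based on
# the PR title "parameterized batchnorm freezing", not confirmed API.
from transformers import RTDetrConfig

# Larger backbones: keep backbone batch norms frozen (assumed default).
config_r50 = RTDetrConfig(freeze_backbone_batch_norms=True)

# R18-style recipe: leave backbone batch norms trainable, matching the
# original authors' setup for the smaller backbone.
config_r18 = RTDetrConfig(freeze_backbone_batch_norms=False)

print(config_r18.freeze_backbone_batch_norms)  # False
```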
* Update src/transformers/models/rt_detr/configuration_rt_detr.py
Co-authored-by: Pavel Iakubovskii <qubvel@gmail.com>