onnxruntime
bbd38508 - [QNN EP] Support quantized BatchNorm with per-channel DQ params on QNN HTP (#26959)

[QNN EP] Support quantized BatchNorm with per-channel DQ params on QNN HTP (#26959)

## Motivation
QNN HTP was rejecting quantized BatchNorm models whose parameters (scale, mean, var) come through DequantizeLinear nodes with per-channel INT8 quantization. This pattern is common in models produced by quantization tools.

## Changes
- Add helpers that resolve BatchNorm parameters through DQ nodes to their underlying initializers
- Support per-channel dequantization for BatchNorm parameters
- Support UFIXED_POINT_16 input datatype
- Add a unit test covering this QDQ parameter configuration (see the sketch below)
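A minimal sketch of the model pattern this change targets, built with the `onnx` Python helpers: a `BatchNormalization` node whose scale/B/mean/var inputs each come from a per-channel (axis=0) INT8 `DequantizeLinear` over a constant initializer. Names, shapes, and values here are hypothetical, not taken from the commit's test.

```python
# Sketch (not part of this commit): BatchNorm whose params arrive via
# per-channel INT8 DequantizeLinear nodes -- the pattern QNN HTP rejected.
import numpy as np
from onnx import TensorProto, helper, numpy_helper

C = 3  # channel count (hypothetical)

def per_channel_dq(name):
    """INT8 initializer + per-channel DequantizeLinear (axis=0) -> float param."""
    q = numpy_helper.from_array(
        np.array([10, 20, 30], dtype=np.int8), name + "_q")
    scale = numpy_helper.from_array(
        np.full((C,), 0.02, dtype=np.float32), name + "_scale")
    zp = numpy_helper.from_array(np.zeros((C,), dtype=np.int8), name + "_zp")
    dq = helper.make_node(
        "DequantizeLinear", [name + "_q", name + "_scale", name + "_zp"],
        [name], axis=0, name=name + "_dq")
    return [q, scale, zp], dq

inits, nodes = [], []
for param in ("scale", "bias", "mean", "var"):
    tensors, dq_node = per_channel_dq(param)
    inits += tensors
    nodes.append(dq_node)

# BatchNormalization consumes the dequantized per-channel parameters.
nodes.append(helper.make_node(
    "BatchNormalization", ["X", "scale", "bias", "mean", "var"], ["Y"]))

graph = helper.make_graph(
    nodes, "qdq_batchnorm",
    [helper.make_tensor_value_info("X", TensorProto.FLOAT, [1, C, 8, 8])],
    [helper.make_tensor_value_info("Y", TensorProto.FLOAT, [1, C, 8, 8])],
    initializer=inits)
model = helper.make_model(graph)
```

In a full QDQ model the activation input would also pass through Quantize/Dequantize pairs (e.g. a uint16 activation corresponding to QNN's UFIXED_POINT_16); the sketch keeps the activation in float to isolate the per-channel parameter pattern.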