pytorch
ffbd13b6 - Fix for swap_custom_module_to_observer doing duplicate swaps on the same node.target (#91905)

Summary:
This is a fix for the following issue: when two nodes in a model have the same dtype / node.target, the torch quantization prepare_fx flow does not check for duplicates and tries to do a custom module swap twice. When it attempts to swap the same target a second time, swap_custom_module_to_observed detects the observed module class instead of the float module class on the target and fails on an assertion.

The added unit test demonstrates a simple example that fails in the absence of this fix.

Test Plan:
buck test mode/dev //caffe2/test:quantization_fx -- --exact 'caffe2/test:quantization_fx - test_custom_module_class_input_has_duplicate_nodes (quantization.fx.test_quantize_fx.TestQuantizeFx)'

Reviewed By: vkuzo

Differential Revision: D42023273

Pull Request resolved: https://github.com/pytorch/pytorch/pull/91905

Approved by: https://github.com/jerryzh168
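To make the duplicate-node scenario concrete, the sketch below (illustrative only, not taken from the patch; the Inner and Outer module names are made up) shows how calling the same submodule instance twice produces two call_module nodes that share a single node.target, which is the situation in which prepare_fx previously attempted the custom module swap twice:

import torch
import torch.fx

class Inner(torch.nn.Module):
    def forward(self, x):
        return x + 1

class Outer(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.inner = Inner()

    def forward(self, x):
        # The same submodule instance is invoked twice, so the traced graph
        # contains two call_module nodes whose target is "inner".
        return self.inner(self.inner(x))

gm = torch.fx.symbolic_trace(Outer())
targets = [n.target for n in gm.graph.nodes if n.op == "call_module"]
print(targets)  # ['inner', 'inner'] -- duplicate node.target

# Before this fix, configuring Inner as a custom module and running prepare_fx
# on a model shaped like this would attempt the swap once per node; the second
# attempt saw the already-observed class on the target and hit the assertion.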