[SigridHashOp] Fix converter (#34836)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/34836
After supplying the SigridHashOp argument, I realized the shape inference is still wrong because the argument is not supplied in the debug_ssa. Thanks to yinghai for pointing out that I hadn't fixed the converter; fixing it in this diff.
Test Plan:
Ran the binary and checked the exported op:
op {
  input: "sequential_250/parallel/normalization/dper_feature_normalization/sparse_features_processor/sparse_feature_transform/gather_ranges_GSF_IDLIST_COOCCUR_APP_ID_NEKO_ORGANIC_1D_7D_INSTALL_V1/gathered_values_0"
  output: "sequential_250/parallel/normalization/dper_feature_normalization/sparse_features_processor/sparse_feature_transform/sequential_1/hash_feature_ids/SigridHash:0_0"
  type: "SigridHash"
  arg {
    name: "salt"
    i: 0
  }
  arg {
    name: "maxValue"
    i: 100000
  }
  arg {
    name: "hashIntoInt32"
    i: 1
  }
  arg {
    name: "net_pos"
    i: 3
  }
}
It now carries hashIntoInt32.
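The check above can be sketched in code. This is a minimal illustrative snippet, not the actual test: the exported op is modeled as a plain dict rather than the real caffe2 OperatorDef proto, and `has_arg` is a hypothetical helper.

```python
# Hypothetical sketch: verify that the exported SigridHash op carries the
# hashIntoInt32 argument (the field the converter previously dropped).
# The op is a plain dict standing in for a caffe2 OperatorDef.
exported_op = {
    "type": "SigridHash",
    "args": {"salt": 0, "maxValue": 100000, "hashIntoInt32": 1, "net_pos": 3},
}

def has_arg(op, name):
    """Return True if the op's argument list includes the named argument."""
    return name in op["args"]

assert has_arg(exported_op, "hashIntoInt32")
assert exported_op["args"]["hashIntoInt32"] == 1
```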
Reviewed By: yinghai
Differential Revision: D20457057
fbshipit-source-id: 023ade5e66df82037a8f2da3174383dda8aff230