pytorch
be757957 - Support softmax with D == 0 (#29167)

Support softmax with D == 0 (#29167)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/29167

As titled. This fix is crucial because multi_channel splitting can create a history that has no items (i.e., D == 0), which leads to a flow failure.

Test Plan:
Unittest flow test: before fix: f148783160; after fix: f149082299
buck test mode/dev-nosan caffe2/caffe2/python/operator_test:softmax_ops_test

Reviewed By: xianjiec

Differential Revision: D18296081

fbshipit-source-id: e0bb2dc2c4e5b465e213f31e5c5ced3a7e1fd574
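The sketch below is not the Caffe2 kernel touched by this commit; it is a minimal Python/NumPy illustration (function name softmax_ref is made up) of the behavior the fix enables: a softmax that tolerates an input whose inner dimension D is 0 and returns an empty output of the same shape instead of failing.

import numpy as np

def softmax_ref(x, axis=-1):
    # Hypothetical reference softmax, not the Caffe2 operator itself.
    if x.shape[axis] == 0:
        # D == 0: nothing to normalize over, so return an empty array
        # with the same shape as the input rather than erroring out.
        return np.empty_like(x)
    shifted = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(shifted)
    return e / e.sum(axis=axis, keepdims=True)

# Example: a batch with N rows but D == 0 columns, as a split that
# yields no items might produce.
empty = np.zeros((4, 0), dtype=np.float32)
out = softmax_ref(empty)
assert out.shape == (4, 0)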