b185359f - Avoid clone for sparse tensors during accumulation of grads. (#33427)

Avoid clone for sparse tensors during accumulation of grads. (#33427)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/33427

This PR is an attempt to avoid clone for sparse tensors, similar to how we currently avoid clone for dense tensors. As per my understanding, even if the 'indices' and 'values' of a sparse tensor are non-contiguous, operations like 'add' are still supported. As a result, the major change in this PR is to create a shallow copy instead of clone() for sparse tensors.

ghstack-source-id: 99838375
Test Plan: waitforbuildbot
Differential Revision: D19926698
fbshipit-source-id: b5a3f36c2aa273e17f8b7a9f09c1ea00e7478109
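The distinction the commit relies on can be sketched in plain Python. The container below is a hypothetical minimal COO-style sparse gradient holder (not PyTorch's actual `torch.sparse` implementation); it only illustrates why a shallow copy is cheaper than `clone()`: the shallow copy aliases the existing `indices`/`values` storage instead of duplicating it.

```python
# Hypothetical minimal COO-style sparse container, for illustration only;
# this is NOT PyTorch's real sparse tensor implementation.

class SparseGrad:
    def __init__(self, indices, values):
        self.indices = indices  # coordinate list of nonzero entries
        self.values = values    # corresponding values

    def clone(self):
        # Deep copy: duplicates the underlying storage
        # (the cost this commit avoids for sparse gradients).
        return SparseGrad(list(self.indices), list(self.values))

    def shallow_copy(self):
        # Shares storage with the original
        # (the cheaper alternative the commit switches to).
        return SparseGrad(self.indices, self.values)


grad = SparseGrad([0, 3], [1.0, 2.0])

deep = grad.clone()
shallow = grad.shallow_copy()

assert deep.values is not grad.values   # clone owns fresh storage
assert shallow.values is grad.values    # shallow copy aliases storage
```

In the real PR the same idea applies at the autograd accumulation step: because sparse 'add' tolerates non-contiguous 'indices' and 'values', the accumulator can safely alias the incoming gradient's storage rather than cloning it.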