pytorch
c5a1f043 - Enabled BFloat16 support for cumsum, logcumsumexp, cumprod, cummin & cummax on CUDA (#57904)

Summary: Enabled BFloat16 support for `cumsum`, `logcumsumexp`, `cumprod`, `cummin` & `cummax` on CUDA

Pull Request resolved: https://github.com/pytorch/pytorch/pull/57904
Reviewed By: ailzhang
Differential Revision: D28558722
Pulled By: ngimel
fbshipit-source-id: 2a8e49c271e968f841d24534b6cc7be162d3a5aa
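For context, a minimal sketch of what this change enables: calling the cumulative ops on a `torch.bfloat16` CUDA tensor, which previously raised a dtype-not-supported error on these kernels. The shapes and values below are illustrative, and the snippet assumes a CUDA device plus a PyTorch build that includes this commit.

```python
import torch

# Assumption: a CUDA device is available and the build includes this change.
x = torch.randn(4, 8, dtype=torch.bfloat16, device="cuda")

# Cumulative reductions along dim=1, all accepting bfloat16 on CUDA after this commit.
s = torch.cumsum(x, dim=1)                 # running sum
p = torch.cumprod(x, dim=1)                # running product
l = torch.logcumsumexp(x, dim=1)           # log of the cumulative sum of exponentials
mn_vals, mn_idx = torch.cummin(x, dim=1)   # running minimum values and their indices
mx_vals, mx_idx = torch.cummax(x, dim=1)   # running maximum values and their indices

print(s.dtype, mn_vals.dtype)  # torch.bfloat16 torch.bfloat16
```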