de73f9a5 - Add forward AD support for logsumexp, log_softmax, softmax, nll_loss, and cross_entropy (#73741)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/73741

There are probably more perf improvements that can be made, for example reusing more quantities from the forward pass and doing more things in place, but in the spirit of improving coverage, this is probably OK for now.

Note: I didn't do anything with half_to_float, but CUDA (locally) hasn't complained yet.

Test Plan: Imported from OSS

Reviewed By: ejguan

Differential Revision: D34690141

Pulled By: soulitzer

fbshipit-source-id: fe934e191fee2c8e956d7a5f4b553923adf1b33f
(cherry picked from commit ae49aff7f7c8496e04a3ce7667d8f068ca0a52ec)
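
A minimal sketch of what this coverage enables from the Python side, using the public forward-mode AD API (torch.autograd.forward_ad). The shapes, dim, and targets below are illustrative assumptions, not taken from the PR:

```python
import torch
import torch.autograd.forward_ad as fwAD
import torch.nn.functional as F

# Illustrative shapes/values only (not from the PR).
primal = torch.randn(4, 10)
tangent = torch.randn(4, 10)            # direction for the JVP
target = torch.randint(0, 10, (4,))

with fwAD.dual_level():
    # Pack the input and its tangent into a dual tensor.
    dual_input = fwAD.make_dual(primal, tangent)

    # Ops covered by this commit now propagate the tangent forward.
    lse = torch.logsumexp(dual_input, dim=1)
    loss = F.cross_entropy(dual_input, target)

    # unpack_dual returns the primal output and its JVP.
    lse_primal, lse_tangent = fwAD.unpack_dual(lse)
    loss_primal, loss_tangent = fwAD.unpack_dual(loss)
```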