pytorch
8fc687f7 - Add activation functions (ReLU and SiLU for now) for structured sparse linear operator (#101339)

Differential Revision: [D46453476](https://our.internmc.facebook.com/intern/diff/D46453476)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/101339
Approved by: https://github.com/cpuhrsch
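The commit fuses ReLU and SiLU into the structured (2:4 semi-sparse) linear operator's epilogue; the exact PyTorch kernel interface is internal and not shown here. As a minimal sketch of just the activation math the commit adds, here are the two functions in plain Python (not the actual fused implementation):

```python
import math

def relu(x: float) -> float:
    # ReLU: pass positives through, zero out negatives
    return max(0.0, x)

def silu(x: float) -> float:
    # SiLU (a.k.a. swish): x * sigmoid(x)
    return x / (1.0 + math.exp(-x))
```

Fusing these into the sparse linear kernel avoids a separate elementwise pass over the output, which is the usual motivation for epilogue activations in GEMM-style operators.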