Add activation functions (ReLU and SiLU for now) for the structured sparse linear operator (#101339)
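
As a rough illustration of the semantics this change adds, the sketch below is a dense reference of what "linear followed by ReLU or SiLU" computes; it is not the fused sparse kernel from this PR. The helper name `linear_with_activation_reference` and the string-valued `activation` argument are purely illustrative assumptions; the actual operator works on a 2:4 (semi-)structured sparse weight and fuses the activation into the sparse matmul, and its real signature is defined in the PR itself.

```python
# Illustrative dense reference only -- the real operator in this PR runs on a
# 2:4 structured sparse weight and fuses the activation into the kernel.
import torch
import torch.nn.functional as F


def linear_with_activation_reference(x, weight, bias=None, activation=None):
    # activation: None, "relu", or "silu" (the two activations added here)
    out = F.linear(x, weight, bias)  # out = x @ weight.T + bias
    if activation == "relu":
        out = F.relu(out)
    elif activation == "silu":
        out = F.silu(out)  # silu(x) = x * sigmoid(x)
    return out


# Example with a weight following a 2:4 sparsity pattern (two nonzeros in
# every group of four elements along the input dimension), which is the
# structure the sparse linear operator targets.
x = torch.randn(8, 16)
w = torch.randn(32, 16)
mask = torch.tensor([1, 1, 0, 0], dtype=torch.bool).repeat(16 // 4)
w = w * mask
y = linear_with_activation_reference(x, w, activation="silu")
print(y.shape)  # torch.Size([8, 32])
```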
Differential Revision: [D46453476](https://our.internmc.facebook.com/intern/diff/D46453476)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/101339
Approved by: https://github.com/cpuhrsch