[PyTorch] _addmm_activation native function for matmul/bias/activation fusion
Pull Request resolved: https://github.com/pytorch/pytorch/pull/74490
This adds an extended version of addmm that takes advantage of cuBLASLt's fused addmm + ReLU/GELU support.
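Semantically, the fused op computes the activation applied to the usual addmm result, i.e. `act(beta * bias + alpha * (mat1 @ mat2))`. A minimal sketch of that equivalence below; the exact private-function signature (`_addmm_activation(self, mat1, mat2, *, beta=1, alpha=1, use_gelu=False)`) is assumed from this PR's description, so the fused call is guarded:

```python
import torch

bias = torch.randn(4)
mat1 = torch.randn(3, 5)
mat2 = torch.randn(5, 4)

# Reference (unfused) path: bias add + matmul, then the activation.
unfused_relu = torch.relu(torch.addmm(bias, mat1, mat2))

# Fused variant added by this PR; signature assumed from the PR title.
# use_gelu=False selects ReLU, use_gelu=True selects GELU.
if hasattr(torch, "_addmm_activation"):
    fused_relu = torch._addmm_activation(bias, mat1, mat2)
    assert torch.allclose(fused_relu, unfused_relu, atol=1e-6)
```

On CUDA tensors the fused path dispatches to cuBLASLt's epilogue support, avoiding a separate activation kernel launch.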
Differential Revision: [D35019612](https://our.internmc.facebook.com/intern/diff/D35019612/)
Approved by: https://github.com/ngimel