[quant] Implement forward and backward autograd functions for fake quantize (#81438)
### Summary:
This PR implements custom autograd functions defining the forward and backward passes used in APoT (additive powers-of-two) fake quantization. The implementation follows the PyTorch tutorial on custom autograd functions: https://pytorch.org/tutorials/beginner/examples_autograd/polynomial_custom_function.html
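As a rough illustration of the pattern the PR follows (not the PR's actual code), a fake-quantize autograd function typically quantizes then dequantizes in `forward` and uses a straight-through estimator in `backward`. The class and parameter names below are hypothetical, and uniform (affine) quantization stands in for the APoT grid:

```python
import torch

class FakeQuantizeSTE(torch.autograd.Function):
    """Sketch of a fake-quantize custom autograd function.

    Forward: quantize then dequantize the input.
    Backward: straight-through estimator — pass gradients through
    unchanged where the input fell inside the representable range.
    """

    @staticmethod
    def forward(ctx, x, scale, zero_point, quant_min, quant_max):
        q = torch.clamp(torch.round(x / scale) + zero_point, quant_min, quant_max)
        # Save what backward needs: the input and the representable range.
        ctx.save_for_backward(x)
        ctx.bounds = (scale * (quant_min - zero_point),
                      scale * (quant_max - zero_point))
        return (q - zero_point) * scale

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        lo, hi = ctx.bounds
        mask = (x >= lo) & (x <= hi)
        # One gradient per forward argument; the quantization
        # parameters are treated as non-differentiable here.
        return grad_output * mask.to(grad_output.dtype), None, None, None, None

# Usage: apply() wires the function into the autograd graph.
x = torch.randn(4, requires_grad=True)
y = FakeQuantizeSTE.apply(x, 0.1, 0, -128, 127)
y.sum().backward()
```

The APoT variant in the PR differs in that the quantization levels are sums of powers of two rather than a uniform grid, but the forward/backward structure is the same.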
### Test Plan:
Run tests with: `python test/quantization/core/experimental/test_fake_quantize.py`
Pull Request resolved: https://github.com/pytorch/pytorch/pull/81438
Approved by: https://github.com/jerryzh168