Inductor cpp wrapper: support torch.complex64 (#105305)
Add `torch.complex64` to the cpp wrapper's supported dtype list to fix a CPU cpp wrapper failure on llama.
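The change boils down to extending a dtype-to-C++-type table so codegen no longer rejects complex64 inputs. A minimal sketch of that idea (the dict and function names here are illustrative, not Inductor's actual internals; the `c10::complex<float>` mapping is the C++ type PyTorch uses for complex64):

```python
# Hypothetical sketch of a cpp-wrapper dtype table; names are illustrative.
DTYPE_TO_CPP = {
    "torch.float32": "float",
    "torch.float64": "double",
    "torch.int64": "int64_t",
    "torch.bool": "bool",
    # The kind of entry this PR adds: complex64 -> c10::complex<float>
    "torch.complex64": "c10::complex<float>",
}

def cpp_type(dtype: str) -> str:
    """Return the C++ scalar type for a torch dtype, or raise if unsupported."""
    try:
        return DTYPE_TO_CPP[dtype]
    except KeyError:
        # Before the fix, complex64 would land here and fail codegen.
        raise NotImplementedError(f"cpp wrapper does not support {dtype}")

print(cpp_type("torch.complex64"))
```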
Pull Request resolved: https://github.com/pytorch/pytorch/pull/105305
Approved by: https://github.com/jgong5, https://github.com/desertfire, https://github.com/jansel