[inductor][Optimus]Improve logging for Optimus (#110186)
Summary: This builds on diff D49340843. We add more logs to the group_batch_fusion pre-grad passes so it is easier to debug and track which fusions were applied and how the graph changed.
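As a rough illustration of the log lines shown in the Test Plan below, the pass-level logging looks approximately like the following sketch. This is not the exact internal implementation; in particular, `upload_graph` is a hypothetical stand-in for the internal helper that uploads the printed graph and returns a link.

```python
# Minimal sketch of the added pass-level logging (assumed structure, not the
# literal diff). `upload_graph` is hypothetical; internally the printed graph
# is uploaded to a paste service and the resulting URL is logged.
import logging

import torch

log = logging.getLogger(__name__)


def upload_graph(graph: torch.fx.Graph) -> str:
    """Hypothetical helper: persist the printed graph and return a link to it."""
    return "<graph dump location>"


def group_batch_fusion_passes(graph: torch.fx.Graph, fusions) -> None:
    log.info(
        "Before group_batch fusion in pre grads pass. Print graph: %s",
        upload_graph(graph),
    )
    for rule in fusions:
        # ... apply the fusion rule to the graph (elided) ...
        log.info(
            "Apply fusion %s. Print graph: %s",
            type(rule).__name__,
            upload_graph(graph),
        )
```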
Test Plan:
```
[2023-09-27 20:35:53,844] [0/0] torch._inductor.fx_passes.group_batch_fusion: [INFO] Before group_batch fusion in pre grads pass. Print graph: https://www.internalfb.com/intern/everpaste/?color=0&handle=GEoA8xb22jibUNEEAPYecF9_RVM1br0LAAAz
[2023-09-27 20:35:55,001] [0/0] torch._inductor.fx_passes.group_batch_fusion: [INFO] Apply fusion BatchLinearFusion. Print graph: https://www.internalfb.com/intern/everpaste/?color=0&handle=GPMR9BYffjwToEQCAFS7rgixMi0pbr0LAAAz
[2023-09-27 20:35:57,419] [0/0] torch._inductor.fx_passes.group_batch_fusion: [INFO] Apply fusion BatchLinearLHSFusion. Print graph: https://www.internalfb.com/intern/everpaste/?color=0&handle=GKiA8hNycGpBdAIDAOn0c1Hpef4sbr0LAAAz
[2023-09-27 20:35:57,585] [0/0] torch._inductor.fx_passes.group_batch_fusion: [INFO] BatchLayernormFusion: key = ('batch_layernorm', 'torch.Size([2048, 128])', 'torch.Size([128])', 'torch.Size([128])', '(128,)', '1e-05'); subset size = 7
[2023-09-27 20:35:58,493] [0/0] torch._inductor.fx_passes.group_batch_fusion: [INFO] Apply fusion BatchLayernormFusion. Print graph: https://www.internalfb.com/intern/everpaste/?color=0&handle=GKpftRa9Glxm-MYDAOZb_D80JHsYbr0LAAAz
[2023-09-27 20:35:59,754] [0/0] torch._inductor.fx_passes.group_batch_fusion: [INFO] Apply fusion BatchTanhFusion. Print graph: https://www.internalfb.com/intern/everpaste/?color=0&handle=GPgh9BZQl4EKGckAAES094iV3Atrbr0LAAAz
I0927 20:36:00.532000 3750607 pre_grad.py:71] After group_batch_fusion_pre_grad_passes: https://www.internalfb.com/intern/everpaste/?color=0&handle=GBPb8xYxfrbXuCMDAI5d_a4YyhFBbr0LAAAz
```
Differential Revision: D49710166
Pull Request resolved: https://github.com/pytorch/pytorch/pull/110186
Approved by: https://github.com/jackiexu1992, https://github.com/yanboliang