[dynamo] Raise accumulated cache size limit (#122130)
Fixes #114511
This was raised by IBM folks, where an LLM compile was failing because the model had more than 64 layers and hit the accumulated cache size limit.
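A minimal sketch of the user-facing workaround this PR makes unnecessary for such models: the limit is exposed as a dynamo config knob, so it can be raised explicitly before compiling (the value 256 below is illustrative, not necessarily the new default chosen in this PR).

```python
import torch
import torch._dynamo as dynamo

# Raise the accumulated cache size limit so models with many compiled
# entries (e.g. an LLM with >64 layers) do not fall back to eager.
# 256 here is just an example value.
dynamo.config.accumulated_cache_size_limit = 256

@torch.compile
def layer(x):
    return torch.relu(x) + 1.0

print(layer(torch.randn(4)))
```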
Pull Request resolved: https://github.com/pytorch/pytorch/pull/122130
Approved by: https://github.com/Chillee, https://github.com/jansel
ghstack dependencies: #121954, #122005