[export] fakify module state in nonstrict (#119297)
Summary:
Previously, we were not explicitly fakifying module state in the nonstrict path.
This led to errors when modules were constructed under a fake mode, since the user-provided fake mode clashed with the one we constructed internally to fakify the inputs.
This change uses a single fake mode for everything.
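Roughly, the idea is something like the following hedged sketch (not the PR's actual code; names and structure are illustrative): both the example inputs and the module's parameters/buffers are converted under the same FakeTensorMode, rather than letting a second, internally constructed mode clash with a user-provided one.

```python
import torch
from torch._subclasses.fake_tensor import FakeTensorMode

class M(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 4)

    def forward(self, x):
        return self.linear(x)

mod = M()
example_input = torch.randn(2, 4)

# The single fake mode used for everything.
fake_mode = FakeTensorMode()

# Fakify the user inputs.
fake_input = fake_mode.from_tensor(example_input)

# Fakify module state (parameters and buffers) with the *same* mode.
fake_state = {
    name: fake_mode.from_tensor(t)
    for name, t in {
        **dict(mod.named_parameters()),
        **dict(mod.named_buffers()),
    }.items()
}
```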
As a side effect, this raised the question of how we ought to serialize state_dicts/constants that might be fake tensors. Naively calling torch.save on them understandably explodes, so this diff piggybacks on the infra we already use for meta["val"]. Open to revising this; I'm not confident it's the best way to do it.
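For intuition, a minimal sketch of the metadata-only approach (hypothetical schema, not the serialization format used in this diff): since a fake tensor has no real storage, we record only its tensor metadata, similar in spirit to what is kept in node.meta["val"].

```python
import torch
from torch._subclasses.fake_tensor import FakeTensorMode

fake_mode = FakeTensorMode()
fake_weight = fake_mode.from_tensor(torch.randn(4, 4))

# torch.save(fake_weight, "weight.pt")  # fails: fake tensors carry no data

# Serialize a metadata-only description instead (illustrative fields only).
serialized = {
    "shape": tuple(fake_weight.shape),
    "dtype": str(fake_weight.dtype),
    "device": str(fake_weight.device),
    "requires_grad": fake_weight.requires_grad,
}
```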
Test Plan: unit tests
Differential Revision: D53484942
Pull Request resolved: https://github.com/pytorch/pytorch/pull/119297
Approved by: https://github.com/tugsbayasgalan