pytorch
39e8d71d - Use a ptr to store autograd profiler rng (#24889)

Commit (5 years ago)
Use a ptr to store autograd profiler rng (#24889)

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/24889

Trying to fix #2575.

[Here](https://gist.github.com/suo/7b0bc4b49d3c9e095b9f7eef8fa7c6e8) is all the TLS in libtorch.so (thanks ezyang for figuring out how to find this). I noticed that `CallbackManager::sample_zero_one()::gen` has size 5000, which is larger than the other entries, so this change makes it heap-allocated instead.

Caveat: I have no idea whether this will actually fix anything, or whether making this variable heap-allocated is a bad idea.

Test Plan: Imported from OSS

Differential Revision: D16912540
Pulled By: suo
fbshipit-source-id: 71eb0391bf4c6e85b090f8650a2fbfc2107f2707
Author: suo