1db0f735 - [Profiler] Account for caching when assigning IDs (#88917)

The python tracer caches information about module and optimizer state. That means that for subsequent calls, the presence of a Tensor in these fields does not imply that the Tensor is still live; only that it was live during the first call. (I should perhaps rename the fields to something like `stale_parameters` to convey this.) Unless we discard subsequent calls, ID assignment gets tripped up when it sees a Tensor that was already released.

Differential Revision: [D41226827](https://our.internmc.facebook.com/intern/diff/D41226827/)

Pull Request resolved: https://github.com/pytorch/pytorch/pull/88917
Approved by: https://github.com/chaekit
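For illustration, here is a minimal sketch (hypothetical names; not the profiler's actual implementation) of the failure mode: if cached records are keyed on a Tensor's storage address, a record can outlive the Tensor, and a later allocation may reuse the same address and be conflated with the stale entry.

```python
# Hypothetical sketch of the stale-cache problem; the real profiler's
# ID-assignment logic lives in C++ and is more involved.
import torch

cached_state = {}  # stands in for the tracer's cached module/optimizer state

def record(name, tensor):
    # Key by data pointer, as an address-based ID scheme might.
    cached_state[tensor.data_ptr()] = name

t = torch.ones(1024)
record("weight", t)       # captured on the "first call"
stale_ptr = t.data_ptr()
del t                     # Tensor released; the cached entry is now stale

u = torch.zeros(1024)     # the allocator is free to reuse the old address
if u.data_ptr() == stale_ptr:
    # Without discarding stale entries, `u` would inherit "weight"'s record.
    print("address reused: the cached record mis-identifies the new Tensor")
else:
    print("no reuse this run, but the cached entry is still stale")
```

This is why discarding the cached fields on subsequent calls is the safe choice: liveness was only established at capture time, so any later match by address is unreliable.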
Author: Taylor Robie