b7682d35 - [SR] Refactor memory planner to prepare for new algorithm (#74730)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/74730

Motivation: I am working on a new, more efficient memory planning algorithm. This algorithm cannot replace the old one entirely, because it is only practical for models that have sample inputs to warm up with, so we need a way to make the memory planner's strategy extensible.

My first attempt at implementing the new algorithm crammed everything into the same class, but that became a nightmare to manage (a ton of `if (use_new_strategy)` statements everywhere). It was also a little clumsy because some concepts make sense for one algorithm but not the other (like `StorageGroup`). It is much cleaner to turn `MemoryPlanner` into an abstract base class and have each subclass implement its strategy in `allocateManagedTensors` and `deallocateManagedTensors`.

ghstack-source-id: 153288210

Test Plan: Existing unit tests

Reviewed By: navahgar, hlu1

Differential Revision: D35132124

fbshipit-source-id: c5ef5ae6361b44dedf97090201e244a76e1e6bce
(cherry picked from commit c96f6827c8db88f28c4eb379865ad208beae2034)
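A minimal C++ sketch of the refactoring described above, assuming only what the commit message states: `MemoryPlanner` becomes an abstract base class and each planning strategy lives in a subclass that overrides `allocateManagedTensors` and `deallocateManagedTensors`. The subclass names, member bodies, and `allocate`/`deallocate` entry points here are illustrative, not the actual Static Runtime implementation.

```cpp
// Sketch only: the real Static Runtime planner carries much more state
// (managed tensor lists, output buffers, reused storages, etc.).

// Abstract base class: shared entry points, strategy deferred to subclasses.
class MemoryPlanner {
 public:
  virtual ~MemoryPlanner() = default;

  // Called by the runtime around each inference run.
  void allocate() { allocateManagedTensors(); }
  void deallocate() { deallocateManagedTensors(); }

 protected:
  // Strategy hooks each concrete planner implements.
  virtual void allocateManagedTensors() = 0;
  virtual void deallocateManagedTensors() = 0;
};

// Hypothetical subclass for the existing strategy, where concepts like
// StorageGroup belong.
class StandardMemoryPlanner : public MemoryPlanner {
 protected:
  void allocateManagedTensors() override {
    // ... carve one large buffer into per-StorageGroup slices ...
  }
  void deallocateManagedTensors() override {
    // ... record observed tensor sizes and release the buffer ...
  }
};

// Hypothetical subclass for the new algorithm, which needs sample inputs
// to warm up with before it can plan memory.
class WarmupMemoryPlanner : public MemoryPlanner {
 protected:
  void allocateManagedTensors() override {
    // ... place tensors at offsets precomputed from warm-up runs ...
  }
  void deallocateManagedTensors() override {
    // ... release the arena; precomputed offsets stay valid ...
  }
};
```

With this shape, the runtime can construct whichever planner fits the model (e.g. fall back to the standard one when no sample inputs are available) without scattering `if (use_new_strategy)` checks through a single class.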
Author: Mike Iovine