[pytorch profiler] Add step tracker logic to handle multiple sources of step increments (#90880)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/90880
# Summary
Enables multiple step trackers. Previously there was only one place to mark that a step() had occurred in the program: the PyTorch profiler's step() call.
We are now working on adding an Optimizer step hook (https://github.com/pytorch/pytorch/issues/88446), which raises two problems:
- Programs that already call profiler.step() every iteration could end up double-counting steps.
- If a model uses multiple optimizers, the step could be counted twice or more.
## Solution
We fix this by adding a layer of abstraction before calling step() on the kineto library. The idea is to maintain a step count per requester in a dictionary:
```
{
"ProfilerStep": 100, # triggered by profiler step() call
"Optimizer1Step": 100, # Optimizer 1 or 2 are just examples, could be SGD, Adam etc
"Optimizer2Step": 100,
}
```
To compute the global step count, take the max over the dict values (100 here). Now suppose one requester advances ahead of the others:
```
{
"ProfilerStep": 100,
"Optimizer1Step": 101, # say Optimizer1 was incremented first
"Optimizer2Step": 100,
}
```
Then the global step count is 101.
## Calling kineto
We only call kineto's step() function when the global count increments.
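The per-requester tracking described above can be sketched roughly as follows. This is an illustrative, self-contained sketch, not the actual PyTorch internals; the class and method names (`StepTracker`, `increment_step`, the `on_global_step` callback standing in for kineto's step()) are assumptions for the example.

```python
# Hypothetical sketch of the step-tracker idea: keep one step count per
# requester and only fire the kineto step() callback when the max over
# all requesters (the global step) actually advances.
class StepTracker:
    def __init__(self, on_global_step=None):
        self._steps = {}                      # requester name -> step count
        self._global_step = 0                 # max over all requester counts
        self._on_global_step = on_global_step # stand-in for kineto's step()

    def increment_step(self, requester: str) -> int:
        # A previously unseen requester starts from 0 in this sketch.
        self._steps[requester] = self._steps.get(requester, 0) + 1
        new_global = max(self._steps.values())
        if new_global > self._global_step:
            # Only here do we forward a single step to kineto.
            self._global_step = new_global
            if self._on_global_step is not None:
                self._on_global_step()
        return self._global_step


calls = []
tracker = StepTracker(on_global_step=lambda: calls.append(1))
tracker.increment_step("ProfilerStep")    # global step becomes 1, callback fires
tracker.increment_step("Optimizer1Step")  # global step stays 1, no callback
tracker.increment_step("ProfilerStep")    # global step becomes 2, callback fires
```

Note the design choice: because the global step is the max rather than the sum, a program where both profiler.step() and an optimizer hook increment every iteration still advances (and notifies kineto) only once per iteration.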
# Test Plan:
Added a unit test.
```
buck2 run mode/dev-nosan caffe2/test:profiler
```
Differential Revision: D41751157
Approved by: https://github.com/chaekit