pytorch
f40777e4 - [Dynamo] Fix guard bug when np.float used in control flow (#91991)

[Dynamo] Fix guard bug when np.float used in control flow (#91991)

Fixes 14k github models: https://github.com/jansel/pytorch-jit-paritybench/blob/master/generated/test_Sanster_lama_cleaner.py#L2392

Error:

```
  File "/scratch/ybliang/work/repos/pytorch/torch/_dynamo/guards.py", line 263, in CONSTANT_MATCH
    self.EQUALS_MATCH(guard)
  File "/scratch/ybliang/work/repos/pytorch/torch/_dynamo/guards.py", line 197, in EQUALS_MATCH
    assert istype(
AssertionError: float64
```

`np.float` is unspecialized by default, so it is guarded with `TYPE_MATCH`. However, when it is used in control flow its value gets baked in, which requires an `EQUALS_MATCH` guard. We should make `EQUALS_MATCH` support `np.float`.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/91991
Approved by: https://github.com/jansel
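For context, below is a minimal sketch (not part of the commit) of the kind of user code that hits this path: a NumPy float scalar is passed to a compiled function and branched on, so Dynamo must specialize (bake in) its concrete value and guard it with `EQUALS_MATCH` rather than the default `TYPE_MATCH`. The function, tensor shapes, and values are illustrative, and `torch.compile` is used here simply as the Dynamo entry point.

```python
import numpy as np
import torch


def model(x, threshold):
    # `threshold` is a NumPy float scalar. Branching on it forces Dynamo to
    # bake its concrete value into the compiled graph, so the guard becomes
    # EQUALS_MATCH instead of the TYPE_MATCH used for unspecialized values.
    if threshold > 0.5:
        return x * 2
    return x + 1


compiled = torch.compile(model)
out = compiled(torch.randn(4), np.float64(0.7))
```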