pytorch
37c85cf5 - Add warning if tensor cores are not used (#88844)

Add warning if tensor cores are not used (#88844)

Fixes https://github.com/pytorch/torchdynamo/issues/1839

Should I do this for all backends or just inductor?

## Test

On a V100 I got from AWS:

```python
from torch._dynamo import optimize
import torch

def fn(x, y):
    a = torch.cos(x)
    b = torch.sin(y)
    return a + b

new_fn = optimize("inductor")(fn)
a = new_fn(torch.Tensor(1), torch.Tensor(1))
print(a)
```

## New logs

```
(sourcetorch) ubuntu@ip-172-31-31-152:~/test$ python test.py
/home/ubuntu/pytorch/torch/_dynamo/eval_frame.py:318: UserWarning: Tensor cores are available but not enabled. Consider setting torch.backends.cuda.matmul.allow_tf32 == True in your python script for speedups
  warnings.warn(
tensor([1.3717])
```

Pull Request resolved: https://github.com/pytorch/pytorch/pull/88844

Approved by: https://github.com/ngimel, https://github.com/mlazos, https://github.com/anijain2305
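For reference, a minimal sketch of acting on the new warning. Note the warning text reads `== True` (a comparison), but the actual fix is an assignment; the cuDNN flag shown alongside is a related, optional setting and is an assumption here, not part of this commit:

```python
import torch

# The warning suggests enabling TF32 for matmuls on GPUs with tensor cores.
# The fix is an assignment, not the "== True" comparison printed in the warning:
torch.backends.cuda.matmul.allow_tf32 = True

# Optionally, TF32 can also be enabled for cuDNN convolutions (assumption:
# not part of this commit, but a commonly paired setting):
torch.backends.cudnn.allow_tf32 = True

print(torch.backends.cuda.matmul.allow_tf32)
```

These flags can be set without a GPU present; they only take effect when CUDA kernels actually run.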