pytorch
a7c8c56b - torchdeploy allow embedded cuda interp use without cuda (#59459)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/59459

For any binary that can be used both with and without CUDA, it is better to allow including just the CUDA flavor of the interpreter. The previous logic failed in this case, since it only permitted the CUDA flavor when torch::cuda::is_available() reported true. Now the CUDA flavor is used unconditionally whenever it is present.

Test Plan: Added a new unit test exercising this scenario; ran locally on a devvm without CUDA.

Reviewed By: dzhulgakov

Differential Revision: D28902176

fbshipit-source-id: 5c7c90d84987848471bb6dd5318db15314e0b442