transformers
1ee03164 - fix(testing_utils): guard get_device_capability with torch.cuda.is_available() (#45351)

Commit
1 day ago
fix(testing_utils): guard get_device_capability with torch.cuda.is_available() (#45351)

* fix(testing_utils): guard get_device_capability with torch.cuda.is_available()

* refactor: restructure CUDA/XPU fallback per review (use separate if blocks)

  Change the elif chain to separate if blocks so that when CUDA is installed
  but no GPU is available, the code falls through to check XPU (and then NPU).
  Per @remi-or's suggestion in review.

  Built by Rudrendu Paul, developed with Claude Code

* fix: add TODO for NPU is_available guard per remi-or review

  Add a TODO comment in the IS_NPU_SYSTEM block noting that after torch 2.5.1
  we should use `if hasattr(torch, "npu") and torch.npu.is_available()` for
  consistency with the CUDA/XPU blocks.

  Built by Rudrendu Paul, developed with Claude Code

---------

Co-authored-by: Rudrendu <RudrenduPaul@users.noreply.github.com>
Co-authored-by: Rémi Ouazan <83456801+remi-or@users.noreply.github.com>
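A minimal sketch of the fall-through pattern the review asked for. The backend objects below are hypothetical stand-ins for torch.cuda / torch.xpu / torch.npu (so the sketch runs without PyTorch installed); the point is the control flow: separate `if` blocks, not an `elif` chain, so a CUDA build with no visible GPU still reaches the XPU and NPU checks.

```python
from types import SimpleNamespace

# Hypothetical stand-ins for torch.cuda, torch.xpu, torch.npu. Here CUDA is
# installed but reports no available device, mimicking the bug scenario.
cuda = SimpleNamespace(is_available=lambda: False,
                       get_device_capability=lambda: (8, 0))
xpu = SimpleNamespace(is_available=lambda: True,
                      get_device_capability=lambda: (1, 0))
npu = SimpleNamespace(is_available=lambda: False,
                      get_device_capability=lambda: (0, 0))


def get_device_capability():
    # Separate `if` blocks rather than elif: each guard is checked in turn,
    # so an unavailable CUDA backend falls through to XPU, then NPU.
    if cuda.is_available():
        return cuda.get_device_capability()
    if xpu.is_available():
        return xpu.get_device_capability()
    if npu.is_available():
        return npu.get_device_capability()
    return None


print(get_device_capability())  # CUDA unavailable, falls through to XPU: (1, 0)
```

With an `elif` chain keyed on "CUDA is installed" rather than "a CUDA device is available", the XPU branch would never be reached on a CUDA-enabled build without a GPU; the separate guards avoid that.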