accelerate
46f1391b - Fix XPU inference (#2383)

Fix XPU inference (#2383). Accelerate complains that "Device xpu is not recognized, available devices are integers (for GPU/XPU), 'mps', 'cpu' and 'disk'", but you cannot simply pass 0 as the device either: 0 is treated as a CUDA device, and then it complains that torch is not compiled with CUDA enabled. You will need safetensors >= 0.4.2 if using safetensors files.
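A minimal sketch (not part of the commit) of what this fix enables: passing "xpu" in a device_map to accelerate's dispatch_model. The toy model and tensor shapes are placeholders, and an XPU-enabled PyTorch build is assumed.

```python
# Minimal sketch, assuming an XPU-enabled PyTorch build and this fix applied.
import torch
from torch import nn
from accelerate import dispatch_model

# Placeholder model for illustration only.
model = nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 4))

# With this fix, "xpu" is accepted as a device in a device_map.
# Previously this raised "Device xpu is not recognized, ...", and passing the
# integer 0 instead was interpreted as a CUDA device on a non-CUDA build.
model = dispatch_model(model, device_map={"": "xpu"})

out = model(torch.randn(2, 16, device="xpu"))
print(out.device)
```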