accelerate
937e08ce
- add bf16 mixed precision support for NPU (#1949)
Commit
2 years ago
add bf16 mixed precision support for NPU (#1949)

* add bf16 mixed precision support for NPU
* Explicitly register the NPU backend to PyTorch via `import torch_npu`

---------

Co-authored-by: statelesshz <jihuazhong1@huawei.com>
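The commit gates bf16 mixed precision on whether the target accelerator is an Ascend NPU. As a minimal sketch of that selection logic (the helper name `select_mixed_precision` and its parameters are hypothetical, not part of Accelerate's API; on a real Ascend system the `import torch_npu` line in the commit message is what registers the `npu` device type with PyTorch):

```python
def select_mixed_precision(device_type: str, bf16_supported: bool) -> str:
    """Hypothetical sketch: pick a mixed-precision mode for a device.

    Mirrors the idea in the commit: allow "bf16" when the device is an
    Ascend NPU that reports bf16 support; otherwise fall back to full
    precision ("no").
    """
    if device_type == "npu" and bf16_supported:
        return "bf16"
    return "no"


# Example: an NPU with bf16 support gets bf16 mixed precision.
print(select_mixed_precision("npu", True))   # bf16
print(select_mixed_precision("npu", False))  # no
```

In Accelerate itself, users request this mode via `Accelerator(mixed_precision="bf16")`; the commit's change is what makes that request valid on NPU devices.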
References
#1949 - add bf16 mixed precision support for NPU
Author
ji-huazhong
Parents
5d558f21