transformers
438324c9 - Gaudi: Add the bf16 support for hpu (#37568)

Committed 364 days ago
Gaudi: Add the bf16 support for hpu (#37568)

* Fix: hpu can support bf16
* hpu is not integrated into torch
* Gaudi1 cannot support bf16
* Update src/transformers/utils/import_utils.py

Signed-off-by: yuanwu <yuan.wu@intel.com>
Co-authored-by: Ilyas Moutawwakil <57442720+IlyasMoutawwakil@users.noreply.github.com>