pytorch
80bdfd64 - Skip Bfloat16 support when building for VSX (#61630)

Summary: Copy-paste ifdef guard from vec256/vec256.h. Probably fixes https://github.com/pytorch/pytorch/issues/61575

Pull Request resolved: https://github.com/pytorch/pytorch/pull/61630
Reviewed By: janeyx99
Differential Revision: D29690676
Pulled By: malfet
fbshipit-source-id: f6d91eadab74bcbcb1dc9854ae1b98a0dccacd14