536969ca - Enable fp16 by default for hf models (#782)

Commit · 4 years ago
Enable fp16 by default for hf models (#782)

Summary: This PR sets `fp16` as the default precision for all HuggingFace models. It also includes an extra patch to the transformers package, because `hf_BigBird` must be patched to support fp16. I believe this patch should also be upstreamed: https://github.com/huggingface/transformers/pull/16034

Pull Request resolved: https://github.com/pytorch/benchmark/pull/782
Reviewed By: frank-wei
Differential Revision: D34803502
Pulled By: xuzhao9
fbshipit-source-id: 3d46f7983aa32333b12af605f69e45f1fe3134d7
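Making fp16 the default amounts to casting model parameters from 32-bit to 16-bit floats. A minimal NumPy sketch (not the benchmark's actual code, which casts HuggingFace model weights) of the memory effect of such a cast:

```python
import numpy as np

# Hypothetical stand-in for a model weight matrix.
weights_fp32 = np.ones((4, 8), dtype=np.float32)

# Casting to fp16 halves the memory footprint of the parameters,
# at the cost of reduced precision and range.
weights_fp16 = weights_fp32.astype(np.float16)

print(weights_fp32.nbytes)  # 128 bytes (32 elements x 4 bytes)
print(weights_fp16.nbytes)  # 64 bytes  (32 elements x 2 bytes)
```

In PyTorch the analogous operation is `model.half()`, which casts a module's floating-point parameters and buffers to `torch.float16`; models whose ops are not fp16-safe (as `hf_BigBird` was before the patch) need fixes before this works.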