7f33d625 - Fix the batch size for maml_omniglot and speech_transformer (#1569)

Summary: The default batch size for maml_omniglot should be the number of tasks, which is 5. Disable the batch size customization of speech_transformer because of an upstream issue.

Fixes https://github.com/pytorch/benchmark/issues/1561 and https://github.com/pytorch/benchmark/issues/1560

Pull Request resolved: https://github.com/pytorch/benchmark/pull/1569
Reviewed By: aaronenyeshi
Differential Revision: D45230782
Pulled By: xuzhao9
fbshipit-source-id: 41028e35d848cac8684fd89b3c35094a6a23ed74
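
As a rough illustration of the pattern the commit describes (not code taken from the commit itself), a benchmark model typically declares its default batch size as a class attribute and can opt out of command-line batch size overrides. The sketch below is a minimal stand-in: the names BenchmarkModel, DEFAULT_TRAIN_BSIZE, and ALLOW_CUSTOMIZE_BSIZE are assumptions modeled on the TorchBench harness, and the speech_transformer default shown is a placeholder, not a value from the commit.

```python
# Minimal sketch, assuming a harness-style base class; attribute names
# (DEFAULT_TRAIN_BSIZE, ALLOW_CUSTOMIZE_BSIZE) are assumptions, not
# verbatim from pytorch/benchmark.

class BenchmarkModel:
    """Toy stand-in for the benchmark harness base class."""
    DEFAULT_TRAIN_BSIZE = None    # per-model default training batch size
    ALLOW_CUSTOMIZE_BSIZE = True  # whether a user-supplied batch size is allowed

    def __init__(self, batch_size=None):
        if batch_size is not None and not self.ALLOW_CUSTOMIZE_BSIZE:
            raise NotImplementedError(
                f"{type(self).__name__} does not support a custom batch size"
            )
        self.batch_size = batch_size if batch_size is not None else self.DEFAULT_TRAIN_BSIZE


class MamlOmniglot(BenchmarkModel):
    # MAML meta-trains over a batch of tasks, so the "batch size" here is
    # the number of tasks sampled per outer step; the commit pins it to 5.
    DEFAULT_TRAIN_BSIZE = 5


class SpeechTransformer(BenchmarkModel):
    # Batch size customization is disabled because of the upstream issue
    # referenced in the commit message.
    DEFAULT_TRAIN_BSIZE = 32      # placeholder default for the sketch only
    ALLOW_CUSTOMIZE_BSIZE = False


if __name__ == "__main__":
    print(MamlOmniglot().batch_size)      # prints 5
    try:
        SpeechTransformer(batch_size=64)  # rejected: customization disabled
    except NotImplementedError as err:
        print(err)
```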