Allow passing in batch_size as an argument to distributed trainer (#1276)
Summary:
Pull Request resolved: https://github.com/pytorch/benchmark/pull/1276
Previously there was no way to specify the batch size for the distributed trainer; it can now be provided via args.batch_size.
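
For illustration, a minimal sketch of how such an argument is typically wired up with argparse. The make_trainer name and the default value are assumptions for the sketch, not the repo's actual code:

    # Minimal sketch, not the benchmark repo's actual code: assumes an argparse
    # entry point and a hypothetical make_trainer() taking a batch_size keyword.
    import argparse

    def make_trainer(batch_size):
        # placeholder for the distributed trainer setup
        print(f"building trainer with batch_size={batch_size}")

    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--batch_size", type=int, default=32,
        help="per-replica batch size used by the distributed trainer",
    )
    args = parser.parse_args()
    make_trainer(batch_size=args.batch_size)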
Test Plan: Imported from OSS
Reviewed By: wconstab
Differential Revision: D40966420
Pulled By: davidberard98
fbshipit-source-id: 23a0fda28c5df0d44ede31b51d5254a38a89c3ae