Fix TypeError in the e2e hf_t5 (#1655)
Summary:
When running the e2e hf_t5 benchmark, it fails with the following error:
```
Traceback (most recent call last):
  File "/home/ubuntu/pytorch/benchmark/run_e2e.py", line 56, in <module>
    result = gen_result(m, run(test))
  File "/home/ubuntu/pytorch/benchmark/run_e2e.py", line 18, in run
    func()
  File "/home/ubuntu/pytorch/benchmark/torchbenchmark/e2e_models/hf_t5/__init__.py", line 387, in eval
    generated_tokens = self.accelerator.unwrap_model(self.model).generate(
  File "/root/anaconda3/envs/torchbenchmark/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/root/anaconda3/envs/torchbenchmark/lib/python3.10/site-packages/transformers/generation/utils.py", line 1346, in generate
    (generation_config.num_beams > 1)
TypeError: '>' not supported between instances of 'NoneType' and 'int'
```
The error is caused by `self.hf_args.num_beams` being `None` (LOC 369). This change sets it to 1, which is the conventional default for `num_beams` and matches the value used in `model_factory.py` for the Hugging Face framework code.
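A minimal sketch of the failure mode and the fix, using a hypothetical `HFArgs` container standing in for the benchmark's argument object (the real attribute lives on `self.hf_args`):

```python
# Sketch only: HFArgs is a hypothetical stand-in for the benchmark's
# argument container; the comparison mirrors the one in
# transformers' generate() that raised the TypeError.
from dataclasses import dataclass
from typing import Optional


@dataclass
class HFArgs:
    num_beams: Optional[int] = None  # unset, as in the failing run


def wants_beam_search(num_beams):
    # generate() compares num_beams > 1; with num_beams=None this
    # raises: TypeError: '>' not supported between instances of
    # 'NoneType' and 'int'
    return num_beams > 1


args = HFArgs()
try:
    wants_beam_search(args.num_beams)
except TypeError:
    pass  # reproduces the reported error

# The fix: default num_beams to 1 (i.e. no beam search) before
# calling generate(), so the comparison is well-defined.
if args.num_beams is None:
    args.num_beams = 1

assert args.num_beams == 1
assert wants_beam_search(args.num_beams) is False
```

With `num_beams=1`, `generate()` takes the non-beam-search path, which is the behavior the benchmark previously relied on implicitly.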
Pull Request resolved: https://github.com/pytorch/benchmark/pull/1655
Reviewed By: aaronenyeshi
Differential Revision: D45957026
Pulled By: xuzhao9
fbshipit-source-id: f6c7535b76b623eee69b1051036334666b181d0b