9ade7265 - Fix training hparams in Tacotron2 (#610)

Fix training hparams in Tacotron2 (#610)

Summary: Set the training batch size to 64 to match the value used in the reference source code and in the paper.
Source: https://arxiv.org/pdf/1712.05884.pdf
Code: https://github.com/NVIDIA/tacotron2/blob/bb6761349354ee914909a42208e4820929612069/hparams.py#L84

Because of the larger batch size, the bundled dataset also needed more .wav files: this change adds 186, since the old dataloader shipped with only 2.

Pull Request resolved: https://github.com/pytorch/benchmark/pull/610
Reviewed By: xuzhao9
Differential Revision: D32964566
Pulled By: aaronenyeshi
fbshipit-source-id: b5d06553d9a70d38fe8ea2197858fe7933109f04
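A minimal sketch of the kind of change described above, assuming the benchmark drives Tacotron2 training from a plain hparams object; the class and field names below are illustrative, not the benchmark's actual code.

```python
from dataclasses import dataclass

@dataclass
class Tacotron2TrainHParams:
    # 64 matches NVIDIA's reference hparams.py (L84 at the linked commit)
    # and the batch size reported in the Tacotron 2 paper.
    batch_size: int = 64
    # Other training settings are hypothetical placeholders.
    epochs: int = 1
    learning_rate: float = 1e-3

hparams = Tacotron2TrainHParams()
# With batch_size=64, the bundled dataset must supply at least 64 samples
# per batch, which is why the dataloader grows from 2 to 186 .wav files.
assert hparams.batch_size == 64
```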