a7ae1fb3 - Change the llama max_batch_size to be larger than the default eval batch size (#2283)

Change the llama max_batch_size to be larger than the default eval batch size (#2283)

Summary: Fixes the PyTorch issue https://github.com/pytorch/pytorch/issues/106110 for the llama dynamic-batch test case. The root cause and discussion are described in https://github.com/pytorch/pytorch/issues/106110#issuecomment-1950964863.

Pull Request resolved: https://github.com/pytorch/benchmark/pull/2283
Reviewed By: aaronenyeshi
Differential Revision: D58532244
Pulled By: xuzhao9
fbshipit-source-id: ef2a51974c77e8dcf7e041db10e71869768126a3
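To illustrate why the eval batch size must not exceed max_batch_size, here is a minimal, hypothetical sketch (not the actual benchmark code): llama-style models preallocate per-layer KV caches whose leading dimension is max_batch_size, so feeding a larger batch than the cache was built for fails at the cache-write step. The `KVCache` class and all names below are illustrative, not from the pytorch/benchmark repository.

```python
import numpy as np

class KVCache:
    """Toy stand-in for a llama-style preallocated key/value cache."""

    def __init__(self, max_batch_size: int, max_seq_len: int, dim: int):
        # The cache buffer is allocated once, sized by max_batch_size;
        # it cannot hold more sequences than that.
        self.max_batch_size = max_batch_size
        self.cache = np.zeros((max_batch_size, max_seq_len, dim))

    def write(self, batch: np.ndarray, start_pos: int = 0) -> None:
        bsz, seqlen, _ = batch.shape
        if bsz > self.max_batch_size:
            # This is the failure mode when the eval batch size exceeds
            # max_batch_size, as in the dynamic-batch test case.
            raise ValueError(
                f"batch size {bsz} exceeds max_batch_size {self.max_batch_size}"
            )
        self.cache[:bsz, start_pos:start_pos + seqlen] = batch

# With max_batch_size >= the eval batch size (the fix in this commit),
# the cache write succeeds.
cache = KVCache(max_batch_size=32, max_seq_len=8, dim=4)
cache.write(np.ones((16, 8, 4)))  # ok: 16 <= 32
```

Under this sketch, a dynamic-batch test that grows the batch beyond the configured max_batch_size would raise, which is why the fix is a configuration change rather than a model change.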