7fe795a5 - use smaller batch size for timm_efficientdet in inference (#113095)

use smaller batch size for timm_efficientdet in inference (#113095)

Summary: The previously used batch size hit out-of-memory (OOM) errors during inference.

X-link: https://github.com/pytorch/pytorch/pull/113095
Approved by: https://github.com/xmfan
ghstack dependencies: #112650
Reviewed By: PaliC
Differential Revision: D51101668
Pulled By: eellison
fbshipit-source-id: 73977261c1f64fa9cb96ea5ca89d168cb510a8f1
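The technique behind this commit can be sketched as a per-model batch-size override table for benchmark inference runs. This is an illustrative sketch only: the names `BATCH_SIZE_INFERENCE_OVERRIDES` and `get_inference_batch_size`, the default of 128, and the override value of 32 are all assumptions for demonstration, not the actual PyTorch benchmark-suite API or the value chosen in the commit.

```python
# Illustrative sketch: register a smaller per-model batch size for models
# that run out of GPU memory at the benchmark suite's default.
# All names and numeric values here are hypothetical.

DEFAULT_INFERENCE_BATCH_SIZE = 128  # assumed suite-wide default

# Per-model overrides for models that OOM at the default batch size.
BATCH_SIZE_INFERENCE_OVERRIDES = {
    "timm_efficientdet": 32,  # illustrative value; commit only says "smaller"
}

def get_inference_batch_size(model_name: str) -> int:
    """Return the batch size to use when benchmarking `model_name` in inference."""
    return BATCH_SIZE_INFERENCE_OVERRIDES.get(model_name, DEFAULT_INFERENCE_BATCH_SIZE)

print(get_inference_batch_size("timm_efficientdet"))
print(get_inference_batch_size("resnet50"))
```

Centralizing overrides in one table keeps the change minimal and reviewable: only the affected model's entry changes, and every other model keeps the default.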