pytorch commit 6964aa2c: backout D33469839 (#71443)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/71443

The cogwheel test inline_cvr_infer_canary_pyper_model_publish is timing out. The convert_fx call takes > 20 mins for the local and local_ro sub-modules, which used to take ~2 mins.

Test Plan:
FBLearner flow run
* The following command took 1113 seconds before the diff and 5002 seconds after:
  flow-cli clone-locally 320014219 --run-as-secure-group pytorch_at_scale --operators pyper_model_publish_workflow.pyper_model_publish_workflow.process_torch_package_model_files.process_non_sparse_parameters[0]

Cogwheel test
* Cogwheel test with packages in B3588 (the last good run) took 4694.48 s.
* Cogwheel test with packages in B3590 (the first timeout) took 13975.83 s.
* Cogwheel test with the following packages took 4535.04 s:
  * all packages in B3588 except the model publish
  * the model publish built with D33469839 (https://github.com/pytorch/pytorch/commit/043e84b3d2cb5affee6373ebc95cbafb82fe0017) reversed (created D33633570)

Reviewed By: albanD, jerryzh168

Differential Revision: D33633570

fbshipit-source-id: dc5e777c48a90c551641a3f79126461f6a60449e
(cherry picked from commit 03ab65023a9f4175584ddac1cca7eab51397c84a)
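For context, convert_fx is the final step of PyTorch's FX graph mode quantization, and it is the step whose runtime regressed here. The sketch below only illustrates how such a call is typically issued and timed; the real inline_cvr/pyper sub-modules are internal, so ToyModel is a hypothetical stand-in, and the exact quantization API signatures vary somewhat across PyTorch releases.

    # Minimal sketch, assuming a recent torch.ao.quantization FX API.
    import time
    import torch
    import torch.nn as nn
    from torch.ao.quantization import get_default_qconfig_mapping
    from torch.ao.quantization.quantize_fx import prepare_fx, convert_fx

    class ToyModel(nn.Module):
        # Hypothetical stand-in for the local / local_ro sub-modules.
        def __init__(self):
            super().__init__()
            self.linear = nn.Linear(64, 64)

        def forward(self, x):
            return self.linear(x)

    model = ToyModel().eval()
    example_inputs = (torch.randn(1, 64),)

    # Insert observers and run a calibration pass.
    prepared = prepare_fx(model, get_default_qconfig_mapping("fbgemm"), example_inputs)
    prepared(*example_inputs)

    # Time the convert step, i.e. the call this commit reports regressing
    # from roughly 2 minutes to more than 20 minutes on the internal models.
    start = time.perf_counter()
    quantized = convert_fx(prepared)
    print(f"convert_fx took {time.perf_counter() - start:.3f}s")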