warn user once for possible unnecessary find_unused_params (#50133)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/50133
`find_unused_parameters=True` is only needed when the model has unused parameters that are not known at model definition time or that change across iterations due to control flow.
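For context, a minimal sketch of the case where the flag is genuinely required (module and parameter names are illustrative, not from this PR):

```python
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

class BranchingModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.shared = nn.Linear(10, 10)
        self.branch_a = nn.Linear(10, 1)
        self.branch_b = nn.Linear(10, 1)  # receives no gradient when use_b is False

    def forward(self, x, use_b):
        h = self.shared(x)
        # Which branch runs (and thus which parameters get gradients) varies
        # per iteration, so DDP cannot determine it at construction time.
        return self.branch_b(h) if use_b else self.branch_a(h)

# model = DDP(BranchingModel().to(rank), device_ids=[rank],
#             find_unused_parameters=True)  # required for this model
```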
Unfortunately, many DDP users pass `find_unused_parameters=True` even when they do not need it, sometimes as a precaution against errors DDP may raise (such as the error we raise when not all outputs are used). While this is a larger issue to be fixed in DDP, it would also be useful to warn once if we did not detect any unused parameters.
The downside is that for control flow models where the first iteration has no unused parameters but later iterations do, this produces a false warning. However, I think the warning's value outweighs this downside.
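A minimal sketch of the misuse this change targets (names are illustrative): every parameter of this model receives a gradient each iteration, so `find_unused_parameters=True` only adds overhead, and DDP would now warn once that no unused parameters were detected.

```python
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

# A plain feedforward model: all parameters are used on every iteration.
model = nn.Sequential(nn.Linear(10, 10), nn.ReLU(), nn.Linear(10, 1))

# ddp_model = DDP(model.to(rank), device_ids=[rank],
#                 find_unused_parameters=True)  # unnecessary here; triggers the warning
```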
ghstack-source-id: 119707101
Test Plan: CI
Reviewed By: pritamdamania87
Differential Revision: D25411118
fbshipit-source-id: 9f4a18ad8f45e364eae79b575cb1a9eaea45a86c