Add utility for bundling sample inputs with models (#35631)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/35631
Bundling sample inputs with our models through a standardized interface
will make it possible to write benchmarking and code-coverage tools that
call all models in a uniform way. The intent is to make this a standard
for mobile models within Facebook. It lives in torch/utils so tests
can run on GitHub, and because it might be useful for others as well.
`augment_model_with_bundled_inputs` is the primary entry point. See
its docstring for usage information and the test for some example uses.
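A minimal sketch of the entry point (the model class and sample input here are illustrative, not from the PR): bundle a list of argument tuples with a scripted model, then retrieve and run them uniformly.

```python
import torch
from torch.utils.bundled_inputs import augment_model_with_bundled_inputs

class MyModel(torch.nn.Module):
    def forward(self, x):
        return x + 1

model = torch.jit.script(MyModel())

# Bundle sample inputs: a list of tuples, one tuple of args per call.
augment_model_with_bundled_inputs(model, [(torch.zeros(2, 3),)])

# A benchmarking or coverage tool can now drive any augmented model
# without knowing its input shapes or types in advance.
for args in model.get_all_bundled_inputs():
    out = model(*args)
```

Because the bundled inputs are stored on the model itself, they survive serialization, so a tool loading the saved `.pt` file can call `get_all_bundled_inputs()` with no side-channel metadata.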
One design question I had was how much power should be available for
automatic deflating and inflating of inputs. The current scheme gives
some automatic handling and a reasonable escape hatch
("_bundled_input_inflate_format") for top-level tensor arguments, but no
automatic support for (e.g.) tensors in tuples or long strings. For
more complex cases, we have the ultimate escape hatch of just defining
_generate_bundled_inputs in the model.
Another design question was whether to add the inputs to the model or
wrap the model in a wrapper module that had these methods and delegated
calls to `forward`. Because models can have other exposed methods and
attributes, the wrapper approach seemed too onerous.
Test Plan: Unit test.
Differential Revision: D20925013
Pulled By: dreiss
fbshipit-source-id: 4dbbb4cce41e5752133b4ecdb05e1c92bac6b2d5