0b3f42fa - [PyTorch Edge] Add test for lite interpreter operator caching (#62306)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/62306

Test to verify that caching of operators works as expected. When caching operators during model load, we look them up by operator name. This test ensures that even when multiple operators in the same model share a name, the cache distinguishes between calls that specify a different number of arguments in the serialized bytecode. In this specific test, the model has 3 methods, 2 of which return a `float32` tensor and one of which returns an `int64` tensor. Please see the comments in the diff for details.

ghstack-source-id: 134634613

Test Plan:
Test commands:
```
cd fbsource/fbcode/
buck test mode/dev //caffe2/test/cpp/jit:jit -- --exact 'caffe2/test/cpp/jit:jit - LiteInterpreterTest.OperatorCacheDifferentiatesDefaultArgs'
```
```
cd fbsource/
buck test xplat/caffe2:test_lite_interpreter
```

Reviewed By: raziel

Differential Revision: D29929116

fbshipit-source-id: 1d42bd3e6d33128631e970c477344564b0337325
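
For reference, here is a minimal Python sketch (not the C++ test added by this commit) of a model shaped like the one described above: three methods that all call the same operator, two producing `float32` tensors and one producing an `int64` tensor, with different numbers of arguments specified at the call sites. The class name, file name, and the choice of `aten::zeros` are illustrative assumptions.
```
import torch
from torch.jit.mobile import _load_for_lite_interpreter


class OpCacheRepro(torch.nn.Module):
    @torch.jit.export
    def zeros_default_dtype(self) -> torch.Tensor:
        # aten::zeros with only the size specified; dtype falls back to float32
        return torch.zeros([8])

    @torch.jit.export
    def zeros_explicit_float(self) -> torch.Tensor:
        # Same operator name, but dtype is passed explicitly (still float32)
        return torch.zeros([8], dtype=torch.float32)

    def forward(self) -> torch.Tensor:
        # Same operator name again, now producing an int64 tensor; a cache
        # keyed only on the operator name could conflate this call with the
        # ones above, which is the situation the test guards against
        return torch.zeros([8], dtype=torch.int64)


# Script the model, save it for the lite interpreter, and round-trip it
scripted = torch.jit.script(OpCacheRepro())
scripted._save_for_lite_interpreter("op_cache_repro.ptl")  # hypothetical file name

m = _load_for_lite_interpreter("op_cache_repro.ptl")
assert m.run_method("zeros_default_dtype").dtype == torch.float32
assert m.run_method("zeros_explicit_float").dtype == torch.float32
assert m.forward().dtype == torch.int64
```
If the operator cache keyed only on the name `aten::zeros`, the three calls above could resolve to the same cached entry despite specifying different arguments in the serialized bytecode, which is exactly the behavior the new test checks for.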