cbe5ab11 - Make JIT Serialization support arbitrary std::function<> IO (#27586)

Summary:
Right now, torch::save() uses std::ostream, which results in unnecessary data copies in practice. The same holds for torch::load().

Adding a std::function<size_t(const void*, size_t)> as an output option, parallel to the existing filename and std::ostream APIs, gives users the flexibility to emit directly to a backing store.

For the simple case of appending the output to a std::string, we observe significant benchmark savings (on the order of -50%), even with the minor std::function<> dispatch overhead. The main reason is that std::ostringstream effectively requires two extra copies of the data beyond a simple string-append lambda.

We also provide a parallel API for load(), though this one is slightly more complex due to the need to do arbitrary-position reads.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/27586

Test Plan:
buck test mode/dev-nosan caffe2/test/...
(Basic serialization test in caffe2/test/cpp/api/serialize.cpp)

Benchmark in experimental/jeremyl/c2/SerializationBench.cpp, with D17823443 (1M time goes from 90ms to 40ms, albeit with the CRC patch applied)

Differential Revision: D17822962
Pulled By: jjlilley
fbshipit-source-id: d344a7e59707f3b30d42280fbab78f87399e4d10
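To illustrate the shape of the writer callback the commit describes, here is a minimal, self-contained sketch. `emit_chunks` is a hypothetical stand-in for the serializer (it is not the real torch::save internals); only the `std::function<size_t(const void*, size_t)>` signature comes from the PR. The string-append lambda shows why this path avoids the extra buffering copies that std::ostringstream incurs.

```cpp
#include <cstddef>
#include <cstring>
#include <functional>
#include <string>

// Writer callback shape from the PR: receives a buffer and its size,
// returns the number of bytes consumed.
using Writer = std::function<size_t(const void*, size_t)>;

// Hypothetical serializer stand-in: streams a few byte chunks through the
// callback, the way a serializer would emit its output incrementally.
size_t emit_chunks(const Writer& write) {
  const char* chunks[] = {"header:", "tensor-bytes", ":footer"};
  size_t total = 0;
  for (const char* c : chunks) {
    total += write(c, std::strlen(c));
  }
  return total;
}

// A string-append writer: each chunk is copied once into the destination
// string, with no intermediate stream buffer in between.
std::string collect() {
  std::string out;
  Writer append = [&out](const void* buf, size_t n) {
    out.append(static_cast<const char*>(buf), n);
    return n;
  };
  emit_chunks(append);
  return out;
}
```

The same pattern lets a caller write directly into a memory-mapped region, a network buffer, or any other backing store, without routing the bytes through std::ostream first.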