[flatbuffer] Fix forward flatbuffer type handling with dynamic type. (#71500)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/71500
Several call sites in flatbuffer_loader.cpp needed to be updated to the newer API introduced by the dynamic type changes.
ghstack-source-id: 147278860
Test Plan:
rebase D33665961
```
[zhxchen17@devbig560.ftw3 /data/users/zhxchen17/fbsource] buck run fbcode/mode/dbg //arvr/firmware/silicon/turing:test_torch -c turing.min_runtime=1 -c turing.dsp_op=1 -c turing.model_file=test1.ptl -c pt.has_backtraces=1
Action graph will be rebuilt because files have been added or removed.
Downloaded 0/4 artifacts, 0.00 bytes, 100.0% cache miss (for updated rules)
Building: finished in 6.1 sec (100%) 253/253 jobs, 3/253 updated
Total time: 6.1 sec
BUILD SUCCEEDED
Conv: input [1, 32, 4, 4] residuals [1] weights [4, 4, 1, 1, 2, 32] nlu_params [4, 128] in_ch 32 out_ch 32 groups 1 kernel stride padding upsample 0 op_type 0 act_type 0
```
Reviewed By: qihqi
Differential Revision: D33668588
fbshipit-source-id: 44163c1bc0ea57e4bd265384a253d6cc7b96ed4a
(cherry picked from commit 746487075e36fe90317b631cb3a839d16fd0723f)