SKIP llama for dynamic size testing (#135960)
Summary:
Running the TorchBench llama model with dynamic shapes fails with:
```
File "/localdisk/leslie/torch_inductor_community/pytorch/torch/fx/experimental/symbolic_shapes.py", line 4182, in produce_guards
raise ConstraintViolationError(
torch.fx.experimental.symbolic_shapes.ConstraintViolationError: Constraints violated (L['inputs'][0].size()[0])! For more information, run with TORCH_LOGS="+dynamic".
- Not all values of RelaxedUnspecConstraint(L['inputs'][0].size()[0]) are valid because L['inputs'][0].size()[0] was inferred to be a constant (32).
```
Skip marking the dynamic dim for this model.
X-link: https://github.com/pytorch/pytorch/pull/135960
Approved by: https://github.com/ezyang
Reviewed By: jeanschmidt
Differential Revision: D62737135
fbshipit-source-id: 71ef5686e924cfebe0284c986e1cb412b3b499d0