transformers
2cc8cf6c - Fix `torch.compile` with `fullgraph=True` when `attention_mask` input is used (#29211)