transformers
2cc8cf6c - Fix `torch.compile` with `fullgraph=True` when `attention_mask` input is used (#29211)

* fix torch.export.export for llama
* do not change doc title
* make fix copies
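
Below is a minimal sketch of the scenario this commit addresses: compiling a Llama-style causal LM with `torch.compile(fullgraph=True)` while passing an explicit `attention_mask` input. The checkpoint name is an illustrative placeholder, not something named in the commit.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-hf"  # placeholder checkpoint, not from the commit
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.eval()

# fullgraph=True asks the compiler to capture the forward pass as a single
# graph; any graph break (e.g. data-dependent branching on attention_mask)
# raises an error instead of silently falling back to eager execution.
compiled_model = torch.compile(model, fullgraph=True)

inputs = tokenizer("Hello, world", return_tensors="pt")
with torch.no_grad():
    out = compiled_model(
        input_ids=inputs["input_ids"],
        attention_mask=inputs["attention_mask"],
    )
print(out.logits.shape)
```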