transformers
Fix `torch.compile` with `fullgraph=True` when `attention_mask` input is used
#29211
Merged
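
A minimal sketch (not taken from the PR itself) of the scenario named in the title: compiling a Llama-style model with `torch.compile(fullgraph=True)` while passing an explicit `attention_mask` that contains padding. The tiny randomly initialised config, shapes, and padding pattern are illustrative assumptions so the snippet runs without downloading a checkpoint.

```python
import torch
from transformers import LlamaConfig, LlamaForCausalLM

# Assumption: a tiny randomly initialised Llama stands in for a real checkpoint
# so the sketch runs without downloads or gated access.
config = LlamaConfig(hidden_size=64, intermediate_size=128, num_hidden_layers=2,
                     num_attention_heads=4, num_key_value_heads=4, vocab_size=1000)
model = LlamaForCausalLM(config).eval()

# fullgraph=True forbids graph breaks, so the code path that expands the 2D
# attention_mask into the causal mask must be traceable end to end.
compiled = torch.compile(model, fullgraph=True)

input_ids = torch.randint(0, config.vocab_size, (2, 8))
attention_mask = torch.ones_like(input_ids)
attention_mask[1, :3] = 0  # padding on the second sequence exercises the mask path

with torch.no_grad():
    logits = compiled(input_ids=input_ids, attention_mask=attention_mask).logits
```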

Commits
  • fix torch.export.export for llama
    fxmarty committed 1 year ago
  • do not change doc title
    fxmarty committed 1 year ago
  • make fix copies
    fxmarty committed 1 year ago
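
The first commit above touches `torch.export.export` for Llama. A minimal sketch of that entry point under the same assumptions as before (tiny randomly initialised config, illustrative shapes); it is not the PR's own test code.

```python
import torch
from transformers import LlamaConfig, LlamaForCausalLM

# Assumption: a small random Llama so the export sketch runs standalone.
config = LlamaConfig(hidden_size=64, intermediate_size=128, num_hidden_layers=2,
                     num_attention_heads=4, num_key_value_heads=4, vocab_size=1000)
model = LlamaForCausalLM(config).eval()

input_ids = torch.randint(0, config.vocab_size, (1, 8))
attention_mask = torch.ones_like(input_ids)

# torch.export.export traces the forward pass into a single ExportedProgram,
# so the attention_mask handling must also be export-friendly.
exported = torch.export.export(
    model, args=(), kwargs={"input_ids": input_ids, "attention_mask": attention_mask}
)
print(exported.graph_signature)
```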