transformers PR #29211 (Merged)
Fix `torch.compile` with `fullgraph=True` when `attention_mask` input is used
Commits (3)
- fix torch.export.export for llama (fxmarty, committed 1 year ago)
- do not change doc title (fxmarty, committed 1 year ago)
- make fix copies (fxmarty, committed 1 year ago)