transformers
8e077a3e
- Fix re-compilations for cross attention cache (#39788)
Commit
175 days ago
Fix re-compilations for cross attention cache (#39788)
References
#39788 - Fix re-compilations for cross attention cache
Author
zucchini-nlp
Parents
1e0665a1