transformers
3bc50d81 - [`FA2`] Add flash attention for opt (#26414)

Committed 2 years ago
[`FA2`] Add flash attention for opt (#26414)

* added flash attention for opt
* added to list
* fix use cache (#3)
* style fix
* fix text
* test fix2
* reverted until 689f599
* torch fx tests are working now!
* small fix
* added TODO docstring
* changes
* comments and .md file modification

---------

Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>
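For context, a minimal usage sketch of the feature this commit enables: loading an OPT checkpoint with the Flash Attention 2 backend in `transformers`. This is not part of the commit itself; it assumes the `flash-attn` package is installed, a CUDA GPU is available, and the model is loaded in fp16 or bf16 (FA2 does not support fp32). The checkpoint name is just an example.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "facebook/opt-350m"  # any OPT checkpoint should work

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # FA2 requires half precision
    # Newer spelling of the flag; at the time of this commit the
    # equivalent was `use_flash_attention_2=True`.
    attn_implementation="flash_attention_2",
).to("cuda")

inputs = tokenizer("Flash attention makes OPT", return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```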