pytorch
bfe5f5bb - [WIP] enable cuda graphs support for flash attention with dropout (#100196)

Commit · 2 years ago

Fixes #99905
Pull Request resolved: https://github.com/pytorch/pytorch/pull/100196
Approved by: https://github.com/drisspg
Author: Natalia Gimelshein
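The scenario this commit enables can be sketched as follows: capturing `scaled_dot_product_attention` with a nonzero `dropout_p` inside a CUDA graph, which previously failed because the dropout RNG state was not graph-safe. This is a minimal illustration assuming PyTorch 2.x; the tensor shapes, dropout probability, and the `run_attention_under_cuda_graph` helper are hypothetical, and the snippet skips graph capture when no CUDA device is present.

```python
# Hedged sketch (not the commit's own test): capture flash-attention-style
# SDPA with dropout under a CUDA graph, then replay it.
import torch
import torch.nn.functional as F

def run_attention_under_cuda_graph():
    # Hypothetical helper for illustration; requires a CUDA device.
    if not torch.cuda.is_available():
        return "cuda unavailable; skipping graph capture"
    # (batch, heads, seq_len, head_dim) in fp16, a shape the flash kernel accepts.
    q = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)
    k = torch.randn_like(q)
    v = torch.randn_like(q)
    # Warm up on a side stream, as recommended before CUDA graph capture.
    s = torch.cuda.Stream()
    s.wait_stream(torch.cuda.current_stream())
    with torch.cuda.stream(s):
        for _ in range(3):
            F.scaled_dot_product_attention(q, k, v, dropout_p=0.1)
    torch.cuda.current_stream().wait_stream(s)
    # Capture one attention call, dropout included, into a graph.
    g = torch.cuda.CUDAGraph()
    with torch.cuda.graph(g):
        out = F.scaled_dot_product_attention(q, k, v, dropout_p=0.1)
    # Replay re-runs the captured kernels; with this fix the dropout mask
    # draws fresh random bits on each replay instead of erroring out.
    g.replay()
    return tuple(out.shape)

print(run_attention_under_cuda_graph())
```

On a machine without a GPU the helper returns the skip message; with CUDA it returns the output shape of the captured attention call.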