pytorch
90537a77 - Update FlashAttention to work with sm90 Gpus (#97051)

Update FlashAttention to work with sm90 GPUs (#97051)

# Summary
FlashAttention was confirmed to work on H100 (sm90) hardware, so the capability checks are updated to account for this.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/97051
Approved by: https://github.com/cpuhrsch
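A minimal sketch of the kind of compute-capability gate this commit relaxes, assuming the check is a simple range test on the device's (major, minor) compute capability. The function name and exact bounds here are hypothetical illustrations, not PyTorch's actual code:

```python
def flash_attention_supported(major: int, minor: int) -> bool:
    """Return True if FlashAttention would be enabled for a GPU with
    compute capability (major, minor).

    Assumed range for illustration: sm80 (Ampere) through sm90 (Hopper),
    inclusive. Before a change like this commit, the upper bound would
    have excluded sm90.
    """
    cap = (major, minor)
    return (8, 0) <= cap <= (9, 0)


# On a real system the capability would come from
# torch.cuda.get_device_capability(); values are hard-coded here.
print(flash_attention_supported(8, 0))  # A100 (Ampere): True
print(flash_attention_supported(9, 0))  # H100 (Hopper): True
print(flash_attention_supported(7, 5))  # Turing: False
```

Tuple comparison makes the inclusive range test concise: `(8, 6) <= (9, 0)` holds element-wise in lexicographic order, so sm86 and sm90 both pass.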