onnxruntime
260a48c8 - [CUDA] Run FlashAttention regression test only when FlashAttention is available (#27206)

### Description
As title. The FlashAttention availability check verifies that torch was built with CUDA support, that the system has a device capable of running FlashAttention, and so on.

### Motivation and Context
Fix Windows CUDA CI failures.

---------

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
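The availability gate described above can be sketched as a skip condition on the test. This is a minimal illustration, not the actual onnxruntime test code: `flash_attention_available` is a hypothetical helper, and the compute-capability threshold (SM 8.0, i.e. Ampere or newer) is an assumption about FlashAttention's hardware requirements.

```python
import unittest


def flash_attention_available() -> bool:
    """Hypothetical availability check mirroring the commit description:
    FlashAttention needs a CUDA-enabled torch build and a suitable GPU."""
    try:
        import torch  # torch may not be installed at all
    except ImportError:
        return False
    if not torch.cuda.is_available():  # torch built without CUDA, or no GPU
        return False
    major, _minor = torch.cuda.get_device_capability()
    return major >= 8  # assumption: FlashAttention requires Ampere (SM80) or newer


class FlashAttentionRegressionTest(unittest.TestCase):
    @unittest.skipUnless(flash_attention_available(), "FlashAttention not available")
    def test_regression(self):
        # Placeholder for the actual regression test body.
        pass
```

With this gate, CI machines that lack CUDA-enabled torch or a capable device (such as the failing Windows CUDA CI) report the test as skipped rather than failed.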