onnxruntime
[CUDA] Run FlashAttention regression test only when FlashAttention is available #27206
Merged

hariharans29 merged 3 commits into main from hari/fix_GQA_build_errors
hariharans29 Run FlashAttention regression test only when FlashAttention is available
4e9cc077
github-actions commented on 2026-01-29
hariharans29 Update onnxruntime/test/python/transformers/test_gqa.py
970418d3
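
The change gates the GroupQueryAttention (GQA) FlashAttention regression test in test_gqa.py so it only runs when FlashAttention can actually be used. Below is a minimal sketch of that gating pattern, not the PR's actual diff: the helper name `has_flash_attention`, the CUDA execution provider check, and the SM80+ compute-capability heuristic are illustrative assumptions.

```python
import unittest

import torch
import onnxruntime


def has_flash_attention() -> bool:
    """Best-effort probe: CUDA EP registered and an Ampere-or-newer GPU present.

    This is a hypothetical helper for illustration; the real availability check
    used by test_gqa.py may differ.
    """
    if "CUDAExecutionProvider" not in onnxruntime.get_available_providers():
        return False
    if not torch.cuda.is_available():
        return False
    major, _minor = torch.cuda.get_device_capability(0)
    return major >= 8  # FlashAttention kernels target SM80+ GPUs


class TestGQAFlashAttentionRegression(unittest.TestCase):
    @unittest.skipUnless(has_flash_attention(), "FlashAttention is not available")
    def test_gqa_flash_attention_regression(self):
        # Placeholder for the actual GroupQueryAttention regression assertions.
        self.assertTrue(has_flash_attention())


if __name__ == "__main__":
    unittest.main()
```

With this pattern the test is reported as skipped (rather than failing or erroring out) on CPU-only builds or pre-Ampere GPUs, which is the behavior the PR title describes.
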
hariharans29 requested a review from tianleiwu 10 days ago
hariharans29 requested a review from copilot-pull-request-reviewer 10 days ago
copilot-pull-request-reviewer commented on 2026-01-29
tianleiwu approved these changes on 2026-01-29
hariharans29 enabled auto-merge (squash) 10 days ago
auto-merge disabled 10 days ago (manually disabled by user)
titaiwangms approved these changes on 2026-01-30
hariharans29 Merge remote-tracking branch 'origin' into hari/fix_GQA_build_errors
65c07068
hariharans29 enabled auto-merge (squash) 9 days ago
hariharans29 merged 260a48c8 into main 5 days ago
hariharans29 deleted the hari/fix_GQA_build_errors branch 5 days ago
