onnxruntime
[CUDA] Run FlashAttention regression test only when FlashAttention is available
#27206
Merged
hariharans29 merged 3 commits into main from hari/fix_GQA_build_errors
Run FlashAttention regression test only when FlashAttention is available (4e9cc077)
github-actions commented on 2026-01-29
Update onnxruntime/test/python/transformers/test_gqa.py (970418d3)
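The change gates the GQA FlashAttention regression test on whether FlashAttention can actually run. A minimal sketch of that pattern using `unittest.skipUnless` — the probe below is hypothetical (not the PR's actual helper) and assumes FlashAttention needs a CUDA GPU of compute capability 8.0 or newer:

```python
import unittest


def flash_attention_available() -> bool:
    """Hypothetical capability probe: report whether FlashAttention
    could run here. Assumption: it requires a CUDA device with
    compute capability >= 8.0 (Ampere or newer)."""
    try:
        import torch  # assumption: torch is in the test environment
    except ImportError:
        return False
    if not torch.cuda.is_available():
        return False
    major, _minor = torch.cuda.get_device_capability()
    return major >= 8


class TestGQARegression(unittest.TestCase):
    # Skipped (not failed) on machines without FlashAttention support,
    # so CPU-only CI builds stay green.
    @unittest.skipUnless(flash_attention_available(),
                         "FlashAttention not available on this device")
    def test_gqa_flash_attention_regression(self):
        ...  # the actual GQA regression checks live in test_gqa.py
```

Skipping at the decorator level (rather than returning early inside the test body) keeps the skip visible in the test report.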
hariharans29 requested a review from tianleiwu 10 days ago
hariharans29 requested a review from copilot-pull-request-reviewer 10 days ago
copilot-pull-request-reviewer commented on 2026-01-29
tianleiwu approved these changes on 2026-01-29
hariharans29 enabled auto-merge (squash) 10 days ago
disabled auto-merge 10 days ago (manually disabled by user)
titaiwangms approved these changes on 2026-01-30
Merge remote-tracking branch 'origin' into hari/fix_GQA_build_errors (65c07068)
hariharans29 enabled auto-merge (squash) 9 days ago
hariharans29 merged 260a48c8 into main 5 days ago
hariharans29 deleted the hari/fix_GQA_build_errors branch 5 days ago
Reviewers: titaiwangms, tianleiwu, github-actions, copilot-pull-request-reviewer
Assignees: no one assigned
Labels: none yet
Milestone: no milestone