onnxruntime
[CUDA] Run FlashAttention regression test only when FlashAttention is available
#27206
Merged
Commits (3)
Run FlashAttention regression test only when FlashAttention is available
hariharans29 committed 76 days ago
Update onnxruntime/test/python/transformers/test_gqa.py
hariharans29 committed 76 days ago
Merge remote-tracking branch 'origin' into hari/fix_GQA_build_errors
hariharans29 committed 75 days ago