[webgpu] Fix the wrong fallback in Attention #26608
Commit d8666005
qjia7 marked this pull request as ready for review 76 days ago
guschmue approved these changes on 2025-11-19
qjia7 merged 81a04ca4 into main 75 days ago
qjia7 deleted the fix_attention branch 75 days ago