[XPU] Add flash_attn2 support for XPU #41956
Add flash_attention_2 and kernels-community/flash-attn support for XPU
6ea866d6
Add flash-attn-2 support for XPU
120807f1
Delete deterministic algorithm for xpu
2dbe3a6a
Fix code style
e577daff
Merge branch 'main' into main
bfca1444
Modify repo_id to match the latest kernels-community/flash-attn2
44cc0ef3
Merge branch 'main' into main
c97650de
Fix code style
295a2ce3
Merge branch 'main' into main
879ed78c
YangKai0616 marked this pull request as ready for review 139 days ago
vasqu commented on 2025-11-11
Update
aef47646
Make quality
555e2f02
Merge branch 'main' into main
b19ba582
Merge branch 'main' into main
4f6ba8b1
Use kernels loading
4c2b5f3e
vasqu commented on 2025-11-20
Update
576078a3
Delete invalid import
4a96b2ad
Merge branch 'main' into main
7eec9d14
vasqu approved these changes on 2025-11-20
Update comment
8965085f
Merge branch 'main' into main
e240a952
Merge branch 'main' into main
61064aab
vasqu enabled auto-merge (squash) 129 days ago
Merge branch 'main' into main
bb524710
vasqu merged 07bfd2f8 into main 129 days ago