[Bug] Fix QwenImageEditPlus Series on NPU #13017
[Bug Fix][Qwen-Image-Edit] Fix Qwen-Image-Edit series on NPU
b103f42c
Enhance NPU attention handling by converting attention mask to boolean
3ed2a75f
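The commit above converts the attention mask to boolean form for the NPU path. As a hedged sketch of that idea (not the PR's actual code; the helper name and mask conventions here are assumptions): NPU fused attention typically wants a boolean "masked-out" mask, while callers often pass an additive float mask (0 = keep, -inf = drop) or a keep-style boolean mask.

```python
import torch

def to_npu_bool_mask(attn_mask):
    # Hypothetical helper, not the diffusers implementation:
    # normalize an attention mask into the inverted boolean form
    # (True = position is masked out) expected by NPU fused attention.
    if attn_mask is None:
        return None
    if attn_mask.dtype == torch.bool:
        # keep-style bool mask (True = attend) -> drop-style mask
        return ~attn_mask
    # additive float mask: -inf (or any negative) entries become True
    return attn_mask < 0

# usage: an additive mask [0, -inf] maps to [False, True]
mask = to_npu_bool_mask(torch.tensor([0.0, float("-inf")]))
```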
Refine attention mask handling in NPU attention function to improve v…
50055640
Clean Code
e042b0d1
Refine attention mask processing in NPU attention functions to enhance…
5c92a776
Remove item() ops on npu fa backend.
8abfddd5
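The "Remove item() ops" commit targets a common accelerator pitfall: calling `.item()` copies a scalar back to the host and forces a device synchronization, stalling the NPU stream. A minimal illustration of the pattern (the function names here are made up for the example, not from the PR):

```python
import torch

def max_seq_len_sync(mask):
    # Synchronizing version: .item() triggers a device-to-host
    # round-trip on every call.
    return int(mask.sum(dim=-1).max().item())

def max_seq_len_async(mask):
    # Stays on device: downstream ops can consume the 0-d tensor
    # directly, so no host sync is needed.
    return mask.sum(dim=-1).max()

# usage: both compute the same value; only the second avoids the sync
mask = torch.tensor([[1, 1, 0], [1, 0, 0]], dtype=torch.bool)
```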
Reuse NPU attention mask by `_maybe_modify_attn_mask_npu`
020a2327
sayakpaul approved these changes on 2026-01-23
Merge branch 'main' into fix_npu_related_error
34da3368
Merge branch 'main' into fix_npu_related_error
16fbf491
Apply style fixes
662f9970
Merge branch 'main' into fix_npu_related_error
8d0ae8ac
Merge branch 'main' into fix_npu_related_error
d43aa786
DN6 commented on 2026-02-16
Update src/diffusers/models/attention_dispatch.py
63335f83
DN6 merged bcbbded7 into main 36 days ago
Assignees: No one assigned