[Fix] enable_xformers_memory_efficient_attention() in Flux Pipeline #12337
Commits:
- 6c5637c7 — Fixes enable_xformers_memory_efficient_attention()
- ca45902f — Update attention.py
- cb0baf8d — Merge branch 'main' into xformers_flux
- 5474c3a7 — Merge branch 'main' into xformers_flux
- f4c50992 — Merge branch 'main' into xformers_flux
DN6 approved these changes on 2025-09-22.
DN6 merged 78031c29 into main 197 days ago.