diffusers
[Fix] enable_xformers_memory_efficient_attention() in Flux Pipeline
#12337
Merged


DN6 merged 5 commits into huggingface:main from SahilCarterr:xformers_flux
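The PR itself carries no description, but the title and the touched file (attention.py) indicate the change: calling enable_xformers_memory_efficient_attention() on a Flux pipeline did not work before this fix. The sketch below is an illustrative usage example, not code from the PR; the model id, prompt, and generation parameters are placeholder assumptions, and it requires the xformers package plus a CUDA GPU.

```python
# Illustrative sketch (not from the PR): enabling xformers memory-efficient
# attention on FluxPipeline, the call this fix is meant to make work.
# Assumes `xformers` is installed and a CUDA device is available; the model
# id, prompt, and generation parameters below are placeholder choices.
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)
pipe.enable_xformers_memory_efficient_attention()  # previously failed for Flux
pipe.to("cuda")

image = pipe(
    "A cat holding a sign that says hello world",
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("flux_xformers.png")
```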
SahilCarterr Fixes enable_xformers_memory_efficient_attention()
6c5637c7
JoeGaffney commented on 2025-09-16
SahilCarterr Update attention.py
ca45902f
SahilCarterr Merge branch 'main' into xformers_flux
cb0baf8d
SahilCarterr Merge branch 'main' into xformers_flux
5474c3a7
SahilCarterr Merge branch 'main' into xformers_flux
f4c50992
DN6 approved these changes on 2025-09-22
DN6 merged 78031c29 into main 197 days ago
SahilCarterr deleted the xformers_flux branch 197 days ago
