Context Parallel w/ Ring & Ulysses & Unified Attention (#11941)
* update
* update
* add coauthor
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
* improve test
* handle ip adapter params correctly
* fix chroma qkv fusion test
* fix fastercache implementation
* fix more tests
* fight more tests
* add back set_attention_backend
* update
* update
* make style
* make fix-copies
* make ip adapter processor compatible with attention dispatcher
* refactor chroma as well
* remove rmsnorm assert
* minify and deprecate npu/xla processors
* update
* refactor
* refactor; support flash attention 2 with cp
* fix
* support sage attention with cp
* make torch compile compatible
* update
* refactor
* update
* refactor
* refactor
* add ulysses backward
* try to make dreambooth script work; accelerator backward not playing well
* Revert "try to make dreambooth script work; accelerator backward not playing well"
This reverts commit 768d0ea6fa6a305d12df1feda2afae3ec80aa449.
* work around compilation problems with triton when doing all-to-all
* support wan
* handle backward correctly
* support qwen
* support ltx
* make fix-copies
* Update src/diffusers/models/modeling_utils.py
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
* apply review suggestions
* update docs
* add explanation
* make fix-copies
* add docstrings
* support passing parallel_config to from_pretrained (see the usage sketch after this list)
* apply review suggestions
* make style
* update
* Update docs/source/en/api/parallel.md
Co-authored-by: Aryan <aryan@huggingface.co>
* up
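
For reference, a minimal usage sketch pieced together from the commits above. It assumes the config class is named `ContextParallelConfig` with ring/Ulysses degree arguments, that it is accepted through the `parallel_config` argument of `from_pretrained`, and that backends are chosen with `set_attention_backend`; the exact names, signatures, and supported backend strings should be checked against docs/source/en/api/parallel.md rather than taken from this sketch.

```python
# Hedged sketch: context parallelism on a diffusers transformer, launched with
# e.g. `torchrun --nproc_per_node=2 demo.py`. Class/argument names below
# (ContextParallelConfig, parallel_config, set_attention_backend, "flash")
# are assumptions based on the commit messages, not a verified API reference.
import torch
import torch.distributed as dist

from diffusers import AutoModel, ContextParallelConfig, WanPipeline

dist.init_process_group("nccl")
rank = dist.get_rank()
device = torch.device("cuda", rank % torch.cuda.device_count())
torch.cuda.set_device(device)

# Shard the attention sequence dimension across ranks; ring_degree selects ring
# attention, ulysses_degree would select the all-to-all (Ulysses) variant.
transformer = AutoModel.from_pretrained(
    "Wan-AI/Wan2.1-T2V-1.3B-Diffusers",
    subfolder="transformer",
    torch_dtype=torch.bfloat16,
    parallel_config=ContextParallelConfig(ring_degree=dist.get_world_size()),
)
# Optional: pick an attention backend that the commits say works with CP
# (flash attention 2, sage attention); the string is an assumption.
transformer.set_attention_backend("flash")

pipe = WanPipeline.from_pretrained(
    "Wan-AI/Wan2.1-T2V-1.3B-Diffusers",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
).to(device)

# Every rank runs the same call; attention is computed cooperatively across ranks.
video = pipe("a cat surfing a wave", num_frames=33).frames[0]
if rank == 0:
    print("generated", len(video), "frames")

dist.destroy_process_group()
```

The same pattern should apply to the other transformers wired up in this PR (Wan, Qwen, LTX), since the parallel config is passed at model load time rather than per pipeline.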
---------
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
Co-authored-by: sayakpaul <spsayakpaul@gmail.com>