[Alpha-VLLM Team] Add Lumina-T2X to diffusers #8652
init lumina-t2i pipeline
05160385
added pipeline code
bd2445ea
added flag-dit and next-dit model
65a6991c
fixed typo and added test code
a0f7e183
init lumina-t2i pipeline
dfb826e5
added pipeline code
e516d507
added flag-dit and next-dit model
6db8b822
fixed typo and added test code
4b598ad6
reformatted demo and models
609f3db0
Add heun sampler for flow matching models
08fcefb2
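This commit title names Heun's method for flow-matching models. As a rough illustration only (not the diffusers scheduler), one second-order Heun step for the flow-matching ODE dx/dt = v(x, t) can be sketched in pure Python:

```python
import math

def heun_step(x, t, dt, velocity):
    """One second-order Heun step for the flow-matching ODE dx/dt = v(x, t)."""
    v0 = velocity(x, t)              # velocity at the current point
    x_euler = x + dt * v0            # first-order (Euler) prediction
    v1 = velocity(x_euler, t + dt)   # velocity at the predicted point
    return x + dt * 0.5 * (v0 + v1)  # average the two slopes

# Toy check: dx/dt = -x has the exact solution x(t) = x0 * exp(-t).
x, t, dt = 1.0, 0.0, 0.1
for _ in range(10):
    x = heun_step(x, t, dt, lambda x, t: -x)
    t += dt
# x now approximates exp(-1) with second-order accuracy
```

The averaging of the slope at the start and at the Euler-predicted endpoint is what makes Heun second order, at the cost of two velocity (model) evaluations per step.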
Added Lumina-Next-SFT model to diffusers
576171c8
Merge branch 'lumina' of https://github.com/PommesPeter/diffusers int…
b707add9
Format code style and fixed merge unused code
f93b9033
Updated docs about lumina
1ad8e2b1
Fixed timestep scale
627b3836
PommesPeter changed the title from "Add Lumina-T2X to diffusers" to "[WIP] Add Lumina-T2X to diffusers" 1 year ago
Fixed import error
d50b85ea
Fixed bug on flow match heun
18762c88
Update: run the pipeline successfully
e3b20b1a
Removed unused files
a6d34b4d
Fixed bugs
8c40b5c5
Fixed bugs
63331aec
Fixed prompt embedding bugs
f45485e4
Removed unused code
c49c16bb
Fix bugs
69b02cbf
Add lumina tests
cf2da8b5
Implement attention in diffusers
759781ee
Merge branch 'lumina' of https://github.com/PommesPeter/diffusers int…
5c9739ec
Fixed AttnProcessor
367e9f97
Delete debug.py
2da4cbbb
Fixed convert scripts
21999fc3
Format code quality and style
8b0d0968
Merge branch 'lumina' of https://github.com/PommesPeter/diffusers int…
042e01b2
PommesPeter changed the title from "[WIP] Add Lumina-T2X to diffusers" to "[Alpha-VLLM Team] Add Lumina-T2X to diffusers" 1 year ago
Refactor qknorm in attention processor
47cf4646
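This commit title refers to QK-norm: normalizing queries and keys before the attention dot product so logit magnitudes stay bounded regardless of the raw q/k norms. A minimal sketch of the idea (illustrative only, not the diffusers attention processor):

```python
import math

def rms_norm(vec, eps=1e-6):
    """RMS-normalize a vector: vec / sqrt(mean(vec**2) + eps)."""
    scale = math.sqrt(sum(v * v for v in vec) / len(vec) + eps)
    return [v / scale for v in vec]

# QK-norm: normalize query and key *before* the attention dot product,
# so the attention logit no longer grows with the raw q/k magnitudes.
q = [4.0, -2.0, 6.0, 0.0]
k = [8.0, 1.0, -3.0, 2.0]
q_n, k_n = rms_norm(q), rms_norm(k)
logit = sum(a * b for a, b in zip(q_n, k_n)) / math.sqrt(len(q_n))
```

In a real transformer the normalization is applied per head and usually carries a learnable elementwise weight, but the stabilizing effect comes from the unit-RMS constraint shown here.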
Updated attention implementation and models
947e002e
Merge branch 'lumina' of https://github.com/PommesPeter/diffusers int…
43e7464d
Update src/diffusers/models/attention.py
11b54e7e
Merge branch 'lumina' of https://github.com/PommesPeter/diffusers int…
0485471f
Updated attention implementation and models
097c5fa8
Updated attention implementation and models
d78f0e03
Merge branch 'main' into lumina
b2c04412
Fixed bugs
0b559a07
Format code
36f7e114
Update src/diffusers/models/transformers/lumina_nextdit2d.py
940769df
Update src/diffusers/pipelines/lumina/pipeline_lumina.py
cc692ee9
Update src/diffusers/models/transformers/lumina_nextdit2d.py
599a1ed4
Update src/diffusers/models/transformers/lumina_nextdit2d.py
07262996
Update src/diffusers/models/transformers/lumina_nextdit2d.py
fd6e9ed7
Update src/diffusers/models/transformers/lumina_nextdit2d.py
b5e76d64
Update src/diffusers/pipelines/lumina/pipeline_lumina.py
4379b31a
Merge branch 'lumina' of https://github.com/PommesPeter/diffusers int…
29770b12
Refactor proportional attention
b692707e
Refactor freqs_cis
4f73fbef
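The `freqs_cis` this commit refactors are the precomputed complex exponentials used by rotary position embeddings. A self-contained sketch of the usual Llama-style construction (an illustration, not the diffusers code):

```python
import cmath

def precompute_freqs_cis(dim, max_len, theta=10000.0):
    """Unit complex numbers e^{i * t * theta^(-2k/dim)} for rotary embeddings."""
    inv_freq = [theta ** (-2 * k / dim) for k in range(dim // 2)]
    return [[cmath.exp(1j * t * f) for f in inv_freq] for t in range(max_len)]

def apply_rotary(vec, freqs):
    """Rotate consecutive (even, odd) pairs of vec by the given unit complex numbers."""
    out = []
    for (x, y), f in zip(zip(vec[0::2], vec[1::2]), freqs):
        z = complex(x, y) * f  # multiplying by a unit complex number = 2D rotation
        out.extend([z.real, z.imag])
    return out

freqs = precompute_freqs_cis(dim=4, max_len=8)
rotated = apply_rotary([1.0, 0.0, 0.0, 1.0], freqs[3])  # position t = 3
```

Because each pair is rotated by a unit complex number, vector norms are preserved, and the relative angle between positions depends only on their distance.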
Fixed typo
388e07ca
Removed init weight distribution and typo
73f69b7a
Fix some bugs in attention
91c934ca
Fix bugs in attention
f84ab691
Fixed convert weight scripts
327c31e8
Fixed typo
899a5c05
Update src/diffusers/models/attention_processor.py
38cbaf9b
Update src/diffusers/models/attention_processor.py
082665c2
Update src/diffusers/models/attention_processor.py
f6c5a182
Update src/diffusers/models/attention_processor.py
a9410c86
Update src/diffusers/models/attention_processor.py
02378b00
Update src/diffusers/models/transformers/lumina_nextdit2d.py
a81c554d
Update src/diffusers/models/attention_processor.py
aee650dd
Update src/diffusers/models/transformers/lumina_nextdit2d.py
e9a45c3e
Update src/diffusers/models/attention_processor.py
98eb745f
Refactor attention output and remove residual in Attn
df2b7d05
Apply suggestions from code review
6cd9936d
Update src/diffusers/models/transformers/lumina_nextdit2d.py
127d1df1
Apply suggestions from code review
f51c75cb
Fixed name of FFN
cc881011
Merge branch 'lumina' of https://github.com/PommesPeter/diffusers int…
e637fd54
Apply suggestions from code review
c70694f9
Renamed input name
f0904b13
Merge branch 'lumina' of https://github.com/PommesPeter/diffusers int…
6fa84cc0
Updated rotary emb
c589ce65
Remove useless codes
07129102
Apply suggestions from code review
32a163c1
Updated variable name
d57cc16e
Refactor positional embedding
0b197c45
Refactor positional embedding
e232b8c1
Updated AdaLN
8ea9c276
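AdaLN (adaptive layer norm), updated in this commit, is the DiT-style conditioning mechanism: a standard LayerNorm whose scale and shift are regressed from the conditioning signal (e.g. the timestep embedding) instead of being fixed learned parameters. A minimal sketch under those assumptions, not the diffusers module:

```python
import math

def ada_layer_norm(x, shift, scale, eps=1e-6):
    """Adaptive LayerNorm: standard LN, then condition-dependent scale/shift."""
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    normed = [(v - mean) / math.sqrt(var + eps) for v in x]
    # (1 + scale) * normed + shift, where scale and shift would be produced
    # by a small MLP from the timestep/text conditioning in a DiT-style block
    return [(1.0 + scale) * n + shift for n in normed]

out = ada_layer_norm([2.0, 4.0, 6.0, 8.0], shift=0.5, scale=0.1)
```

Since the normalized activations have zero mean, the output mean equals the conditioning-dependent shift; this is how the conditioning steers each block.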
Merge branch 'lumina' of https://github.com/PommesPeter/diffusers int…
93d458d5
Added comment about time-aware denoising and fixed a bug from a typo
eb94171d
Fixed code format and removed unused code
780c9457
Fixed code format and removed unused code
0f596b6a
Removed unpatchify
cf1f2379
Update src/diffusers/models/transformers/lumina_nextdit2d.py
b2a834c0
Update src/diffusers/models/attention_processor.py
0da4a17f
Fixed typo
2981da0c
Run style and fix-copies
3694034c
Fixed typo and docs
800dfeb4
added new scheduler
5c1a965c
updated fix-copies
dc821ed1
updated fix-copies
021d5a66
Fixed style
5e19d486
Refactor test func with dummy model
425d6928
Update tests/pipelines/lumina/test_lumina_nextdit.py
7e458373
update test
0547f4a2
Merge branch 'main' into lumina
fb2da651
Updated slow test case
19aa5853
style
44daf8c5
yiyixuxu merged commit 98388670 into main 1 year ago