Add Photon model and pipeline support #12456
Open

DavidBert wants to merge 41 commits into huggingface:main from Photoroom:photon
DavidBert
Add Photon model and pipeline support
fa8faeda
DavidBert commented on 2025-10-09
sayakpaul left review comments on 2025-10-09
just store the T5Gemma encoder
64ddfe52
call enhance_vae_properties only if a vae is provided
2947da0d
remove autocast for text encoder forward
2575997e
david-PHR BF16 example
27421cb4
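For context, BF16 inference with Photon should follow the usual diffusers loading pattern. A minimal sketch, assuming a hypothetical Photoroom/photon checkpoint ID and the standard DiffusionPipeline interface:

```python
# Minimal BF16 sketch; the model ID below is a placeholder, and the merged
# PhotonPipeline may expose additional arguments.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "Photoroom/photon",          # hypothetical checkpoint ID
    torch_dtype=torch.bfloat16,  # load every component in bfloat16
).to("cuda")

image = pipe(
    "a photo of a red bicycle leaning against a brick wall",
    num_inference_steps=28,
).images[0]
image.save("photon_bf16.png")
```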
conditioned CFG
1321ab4c
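"Conditioned CFG" refers to classifier-free guidance: the denoiser is evaluated with and without the text conditioning and the two predictions are blended. The Photon-specific wiring is not reproduced here; the sketch below only shows the generic combination used across diffusers pipelines, with illustrative names:

```python
import torch

def cfg_combine(pred_uncond: torch.Tensor, pred_cond: torch.Tensor, guidance_scale: float) -> torch.Tensor:
    # Standard classifier-free guidance: move the conditional prediction
    # further away from the unconditional one as guidance_scale grows.
    return pred_uncond + guidance_scale * (pred_cond - pred_uncond)

# Inside a denoising loop this is typically applied to a batched forward pass:
#   pred_uncond, pred_cond = model_out.chunk(2)
#   noise_pred = cfg_combine(pred_uncond, pred_cond, guidance_scale=5.0)
```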
remove enhance vae and use vae.config directly when possible
32807a16
move PhotonAttnProcessor2_0 into transformer_photon
117e835a
remove einops dependency and now inherits from AttentionMixin
c86aed24
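Dropping the einops dependency usually amounts to replacing rearrange calls with plain reshape/permute. An illustrative before/after for the common head-split pattern (tensor names are hypothetical, not taken from the PR):

```python
import torch

b, n, heads, head_dim = 2, 16, 8, 64
x = torch.randn(b, n, heads * head_dim)

# einops version (the dependency removed by this commit):
#   q = einops.rearrange(x, "b n (h d) -> b h n d", h=heads)

# pure-PyTorch equivalent:
q = x.reshape(b, n, heads, head_dim).permute(0, 2, 1, 3)  # (batch, heads, seq, head_dim)
```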
unify the structure of the forward block
5f6359f4
update doc
33961430
DavidBert force-pushed from fafd7747 to 33961430 7 days ago
update doc
c78f4449
fix T5Gemma loading from hub
3f703955
fix timestep shift
d09ff3c1
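The exact bug fixed by this commit is not visible from the timeline, but for reference, diffusers' flow-matching schedulers (e.g. FlowMatchEulerDiscreteScheduler) apply the following sigma/timestep shift, which a pipeline has to reproduce consistently when preparing timesteps:

```python
def shift_sigma(sigma: float, shift: float = 3.0) -> float:
    # Static shift used by flow-matching schedulers in diffusers: larger
    # shift values concentrate more sampling steps at high noise levels.
    return shift * sigma / (1 + (shift - 1) * sigma)
```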
sayakpaul commented on 2025-10-13
remove lora support from doc
91486cfc
DavidBert Rename EmbedND to PhotoEmbedND
23dd1817
DavidBert remove modulation dataclass
7efad332
DavidBert put _attn_forward and _ffn_forward logic in PhotonBlock's forward
ef9c48d5
DavidBert rename LastLayer to FinalLayer
178cc6ef
DavidBert remove lora related code
924643aa
DavidBert rename vae_spatial_compression_ratio to vae_scale_factor
faa00b93
DavidBert support prompt_embeds in __call__
804dafd0
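Accepting prompt_embeds in __call__ lets callers precompute text embeddings once and reuse them across generations. A sketch of the intended usage, assuming the standard diffusers convention; the exact encode_prompt signature of the merged PhotonPipeline may differ:

```python
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained("Photoroom/photon", torch_dtype=torch.bfloat16).to("cuda")

# Encode once (encode_prompt's exact signature and return values vary per pipeline).
with torch.no_grad():
    prompt_embeds = pipe.encode_prompt("a watercolor painting of a fox", device="cuda")

# Reuse the same embeddings for several seeds without re-running the text encoder.
for seed in range(4):
    image = pipe(
        prompt_embeds=prompt_embeds,
        generator=torch.Generator("cuda").manual_seed(seed),
    ).images[0]
    image.save(f"fox_{seed}.png")
```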
DavidBert move cross-attention conditioning computation out of the denoising loop
6f90e418
DavidBert add negative prompts
59f4bda1
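Negative prompts plug into classifier-free guidance as the unconditional branch. Assuming the common diffusers __call__ arguments apply to PhotonPipeline (the model ID is again a placeholder):

```python
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained("Photoroom/photon", torch_dtype=torch.bfloat16).to("cuda")

image = pipe(
    prompt="studio photo of a ceramic teapot",
    negative_prompt="blurry, low quality, watermark",
    guidance_scale=5.0,  # the negative prompt only has an effect when CFG is enabled (> 1)
).images[0]
```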
DavidBert Use _import_structure for lazy loading
9ad57202
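The _import_structure pattern is how diffusers' __init__ files defer heavy imports until a public name is actually accessed. The rough shape, with Photon module paths and class names assumed for illustration:

```python
# Shape of a lazily loaded package __init__.py in diffusers (not standalone code).
import sys
from typing import TYPE_CHECKING

from ..utils import _LazyModule  # diffusers' lazy-module helper

_import_structure = {
    "transformer_photon": ["PhotonTransformer2DModel"],  # assumed class name
    "pipeline_photon": ["PhotonPipeline"],               # assumed class name
}

if TYPE_CHECKING:
    # Type checkers and IDEs still see the real symbols.
    from .pipeline_photon import PhotonPipeline
    from .transformer_photon import PhotonTransformer2DModel
else:
    # At runtime the module is swapped for a proxy that imports submodules
    # only when one of the declared names is first accessed.
    sys.modules[__name__] = _LazyModule(
        __name__, globals()["__file__"], _import_structure, module_spec=__spec__
    )
```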
DavidBert make quality + style
027dbd52
DavidBert force-pushed from 52b585fe to 4aeccfea 2 days ago
DavidBert add pipeline test + corresponding fixes
ff28f65f
DavidBert force-pushed from 4aeccfea to ff28f65f 2 days ago
DavidBert requested a review from sayakpaul 2 days ago
sayakpaul removed themselves as a requested reviewer 2 days ago
sayakpaul requested a review from dg845 2 days ago
sayakpaul requested a review from stevhliu 2 days ago
DavidBert utility function that determines the default resolution given the VAE
28b9cf2d
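The idea of the utility is to derive a sensible default image size from the VAE's own configuration rather than hard-coding it. A hypothetical sketch following the AutoencoderKL convention in diffusers (one 2x downsample per block); the PR's actual helper may use different inputs:

```python
def default_resolution_from_vae(vae, base_latent_size: int = 128) -> int:
    # Spatial compression ratio: 2 per downsampling block (AutoencoderKL convention).
    vae_scale_factor = 2 ** (len(vae.config.block_out_channels) - 1)
    # Default pixel resolution is the latent grid size scaled back up,
    # e.g. 128 * 8 = 1024 px for a typical 8x-compressing VAE.
    return base_latent_size * vae_scale_factor
```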
DavidBert requested a review from sayakpaul 2 days ago
stevhliu commented on 2025-10-15
dg845 left review comments on 2025-10-16
DavidBert Refactor PhotonAttention to match Flux pattern
b596595a
DavidBert built-in RMSNorm
c522119f
DavidBert Revert accidental .gitignore change
3239f265
DavidBert parameter names match the standard diffusers conventions
b7bbb04a
DavidBert renaming and remove unnecessary attribute setting
83e03965
DavidBert Update docs/source/en/api/pipelines/photon.md
582b64ad
DavidBert Update docs/source/en/api/pipelines/photon.md
33926e05
DavidBert Update docs/source/en/api/pipelines/photon.md
c9e0a206
DavidBert Update docs/source/en/api/pipelines/photon.md
2877b60c
DavidBert quantization example
ed874752
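The quantization example presumably mirrors the existing bitsandbytes recipe in the diffusers docs: quantize the heavy transformer on load, then hand it to the pipeline. A sketch with assumed model ID, subfolder layout, and transformer weights:

```python
import torch
from diffusers import AutoModel, BitsAndBytesConfig, DiffusionPipeline

# 4-bit NF4 quantization of the transformer only; requires bitsandbytes.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

transformer = AutoModel.from_pretrained(
    "Photoroom/photon",        # placeholder model ID
    subfolder="transformer",   # assumed repo layout
    quantization_config=quant_config,
    torch_dtype=torch.bfloat16,
)

pipe = DiffusionPipeline.from_pretrained(
    "Photoroom/photon",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
).to("cuda")
```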
DavidBert added doc to toctree
8aa65bae
DavidBert requested a review from stevhliu 1 day ago
DavidBert requested a review from dg845 1 day ago
DavidBert Merge branch 'photon' of https://github.com/Photoroom/diffusers into …
fba7b330
stevhliu approved these changes on 2025-10-16
dg845 left review comments on 2025-10-17
DavidBert use dispatch_attention_fn for multiple attention backend support
bef08450
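dispatch_attention_fn routes a single q/k/v call through whichever attention backend is currently selected (PyTorch SDPA by default; FlashAttention and others when enabled), so the processor no longer hard-codes one implementation. A heavily simplified sketch; the exact tensor layout and keyword arguments are defined by diffusers.models.attention_dispatch and may differ:

```python
import torch
from diffusers.models.attention_dispatch import dispatch_attention_fn

batch, seq_len, heads, head_dim = 1, 64, 8, 64
query = torch.randn(batch, seq_len, heads, head_dim, device="cuda", dtype=torch.bfloat16)
key = torch.randn_like(query)
value = torch.randn_like(query)

# Routed to the active backend instead of calling F.scaled_dot_product_attention directly.
out = dispatch_attention_fn(query, key, value)
```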
DavidBert force-pushed from 141337cd to bef08450 11 hours ago
dg845 left further review comments on 2025-10-17
