diffusers
[LoRA] feat: support loading loras into 4bit quantized Flux models.
#10578
Merged
sayakpaul merged 6 commits into main from 4bit-lora-loading
sayakpaul — feat: support loading loras into 4bit quantized models. (commit 779c17b7)
sayakpaul requested a review from BenjaminBossan 1 year ago
sayakpaul changed the title from "[WIP] [LoRA] feat: support loading loras into 4bit quantized models." to "[WIP] [LoRA] feat: support loading loras into 4bit quantized Flux models." 1 year ago
sayakpaul commented on 2025-01-14
sayakpaul requested a review from matthewdouglas 1 year ago
BenjaminBossan commented on 2025-01-14
Further discussion followed between sayakpaul, BenjaminBossan, and matthewdouglas.
sayakpaul — Merge branch 'main' into 4bit-lora-loading (commit f46ba420)
sayakpaul — updates (commit d3d8ef28)
sayakpaul marked this pull request as ready for review 1 year ago
sayakpaul changed the title from "[WIP] [LoRA] feat: support loading loras into 4bit quantized Flux models." to "[LoRA] feat: support loading loras into 4bit quantized Flux models." 1 year ago
sayakpaul requested a review from DN6 1 year ago
sayakpaul — update (commit 8b13c1e4)
sayakpaul requested a review from BenjaminBossan 1 year ago
DN6 commented on 2025-01-15
sayakpaul — remove weight check. (commit c92758fb)
sayakpaul requested a review from DN6 1 year ago
sayakpaul — Merge branch 'main' into 4bit-lora-loading (commit a3f533b8)
DN6 approved these changes on 2025-01-15
sayakpaul merged 2432f80c into main 1 year ago
sayakpaul deleted the 4bit-lora-loading branch 1 year ago
