[LoRA] feat: support loading loras into 4bit quantized Flux models. #10578
Commit 779c17b7: feat: support loading loras into 4bit quantized models.
sayakpaul changed the title from "[WIP] [LoRA] feat: support loading loras into 4bit quantized models." to "[WIP] [LoRA] feat: support loading loras into 4bit quantized Flux models." (1 year ago)
Commit f46ba420: Merge branch 'main' into 4bit-lora-loading
Commit d3d8ef28: updates
sayakpaul marked this pull request as ready for review (1 year ago)
sayakpaul changed the title from "[WIP] [LoRA] feat: support loading loras into 4bit quantized Flux models." to "[LoRA] feat: support loading loras into 4bit quantized Flux models." (1 year ago)
Commit 8b13c1e4: update
DN6 commented on 2025-01-15
Commit c92758fb: remove weight check.
Commit a3f533b8: Merge branch 'main' into 4bit-lora-loading
DN6 approved these changes on 2025-01-15
sayakpaul merged 2432f80c into main (1 year ago)
sayakpaul deleted the 4bit-lora-loading branch (1 year ago)
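The capability this PR adds — loading LoRA weights on top of a bitsandbytes 4-bit quantized Flux transformer — can be sketched as below. This is a hypothetical usage example, not code from the PR itself; the LoRA repo ID is an illustrative assumption, and running it requires a CUDA GPU plus the `diffusers` and `bitsandbytes` packages.

```python
# Sketch: load a LoRA into a 4-bit quantized Flux model (the feature
# merged in this PR). Model download and GPU required; the LoRA repo
# ID below is a placeholder assumption.
import torch
from diffusers import BitsAndBytesConfig, FluxPipeline, FluxTransformer2DModel

# Quantize the Flux transformer to 4-bit NF4 via bitsandbytes.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
transformer = FluxTransformer2DModel.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    subfolder="transformer",
    quantization_config=quant_config,
    torch_dtype=torch.bfloat16,
)
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
).to("cuda")

# With this PR, LoRA weights can be loaded even though the base
# transformer's linear layers are 4-bit bnb layers.
pipe.load_lora_weights("some-user/flux-lora")  # hypothetical LoRA repo
image = pipe("a photo of an astronaut", num_inference_steps=4).images[0]
```

The design point worth noting is that LoRA adapters stay in higher precision while only the frozen base weights are quantized, which is why loading them onto a 4-bit model needs dedicated handling.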