diffusers #7388 (Merged): [LoRA] fix `cross_attention_kwargs` problems and tighten tests
Commits (10)
- debugging (sayakpaul committed 2 years ago)
- let's see the numbers (sayakpaul committed 2 years ago)
- let's see the numbers (sayakpaul committed 2 years ago)
- let's see the numbers (sayakpaul committed 2 years ago)
- restrict tolerance. (sayakpaul committed 2 years ago)
- increase inference steps. (sayakpaul committed 2 years ago)
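The "restrict tolerance." and "increase inference steps." commits are the test-tightening half of the PR title: with more denoising steps the LoRA scale has a larger, more visible effect on the output, so a slice comparison can afford a tighter tolerance. A minimal, self-contained sketch of that idea follows; `run_pipeline`, the seed, and the tolerance values are hypothetical stand-ins, not the diffusers test code.

```python
import numpy as np


def run_pipeline(lora_scale: float, num_inference_steps: int) -> np.ndarray:
    # Hypothetical stand-in for a real pipeline call made with
    # cross_attention_kwargs={"scale": lora_scale}.
    rng = np.random.default_rng(0)
    base = rng.random(9)
    # More steps let the scale accumulate into the output, which is why
    # raising num_inference_steps makes a tighter tolerance meaningful.
    return base * (1.0 + lora_scale * num_inference_steps * 1e-2)


expected_slice = run_pipeline(lora_scale=0.5, num_inference_steps=30)
image_slice = run_pipeline(lora_scale=0.5, num_inference_steps=30)

# Tight atol: a LoRA scale that silently stops being applied shifts the
# slice enough to fail here, where a loose tolerance could mask it.
assert np.allclose(image_slice, expected_slice, atol=1e-4)
assert not np.allclose(
    run_pipeline(lora_scale=0.0, num_inference_steps=30),
    expected_slice,
    atol=1e-4,
)
```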
- shallow copy of cross_attentionkwargs (sayakpaul committed 2 years ago)
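This commit carries the actual fix. The `cross_attention_kwargs` problem named in the PR title is dict mutation: popping the LoRA `scale` out of the caller's dict drops it for every subsequent call. Below is a minimal sketch of the shallow-copy pattern the commit message describes; `forward` and its body are illustrative stand-ins, not the actual diffusers signature.

```python
def forward(sample: float, cross_attention_kwargs: dict | None = None) -> float:
    # Stand-in for a model forward pass that consumes a LoRA "scale" entry.
    if cross_attention_kwargs is not None:
        # Shallow-copy before popping so the caller's dict is untouched;
        # without the copy, "scale" disappears after the first call.
        cross_attention_kwargs = cross_attention_kwargs.copy()
        lora_scale = cross_attention_kwargs.pop("scale", 1.0)
    else:
        lora_scale = 1.0
    return sample * lora_scale  # placeholder computation


kwargs = {"scale": 0.5}
forward(1.0, kwargs)
assert kwargs == {"scale": 0.5}  # the caller's dict survives the call
```

A shallow copy suffices here because only a top-level key is removed; the nested values themselves are never mutated.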
- remove print (sayakpaul committed 2 years ago)
- Merge branch 'main' into debug-lora-scale-issue (sayakpaul committed 2 years ago)
- Merge branch 'main' into debug-lora-scale-issue (sayakpaul committed 2 years ago)