Explanation about LoRA merging into checkpoint #356
Replies: 2 comments
-
I'm wondering about this as well. Merging LoRAs doesn't work for me at all: I've tried everything and I still get completely different results, for whatever reason.
-
I have had luck merging LoRA to LoRA, best when they are very dissimilar; when there is only slight similarity it just makes a mess. There are exceptions, as long as all but one of the LoRAs are fairly weak or overly generalized. I use between 0.1 and the full 1.0 for each LoRA weight, depending on its strength and how much of it I want reflected in the final product. Lots of trial and error. However, I have had very little luck merging LoRA into checkpoints and would also love some advice!
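For what it's worth, the per-LoRA weights discussed above correspond conceptually to scaling each LoRA's low-rank delta before folding it into the matching base weight: W' = W + scale · (alpha / rank) · (B @ A). Here is a minimal NumPy sketch of that idea; the function name and the alpha/rank scaling convention are assumptions for illustration, not the extension's actual code:

```python
import numpy as np

def merge_lora_into_weight(W, A, B, alpha, rank, scale):
    """Fold one LoRA delta into a base weight matrix.

    W: base weight (out_dim x in_dim)
    A: LoRA down-projection (rank x in_dim)
    B: LoRA up-projection (out_dim x rank)
    alpha, rank: LoRA hyperparameters; alpha/rank is a common scaling convention
    scale: the user-chosen merge weight (e.g. 0.1 to 1.0 per LoRA)
    """
    return W + scale * (alpha / rank) * (B @ A)

# Toy example with random matrices standing in for real model weights.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))
A = rng.normal(size=(4, 8))  # often stored as "lora_down"
B = rng.normal(size=(8, 4))  # often stored as "lora_up"

merged = merge_lora_into_weight(W, A, B, alpha=4, rank=4, scale=0.5)
```

Merging several LoRAs just applies this repeatedly with each LoRA's own scale, which is why two LoRAs that trained similar directions in weight space can interfere with each other instead of combining cleanly.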
-
Hello!
I tried the "Merge to checkpoint" option from this extension with weights of 0.5 for 3 LoRAs, but when I use the LoRAs' keywords with the newly merged checkpoint, I do not get the style of the LoRAs at all.
Where is the problem, and how should this be done?
Thank you, and I appreciate your help!