Not all models LORA work with GGUF q8 #1778
SunGreen777 started this discussion in General
Not all LoRA models work with GGUF Q8 checkpoints; some simply have no effect at all. Is there a solution?
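One possible explanation for a LoRA having no visible effect: if the LoRA delta is merged into already-quantized weights and the result is re-quantized, deltas smaller than half the quantization step get rounded away entirely. The sketch below is purely illustrative (it is not this project's code, and the block scale is a made-up value), but it shows the rounding mechanism:

```python
# Illustrative sketch: why a small LoRA delta can vanish when merged into
# 8-bit-quantized weights and re-quantized. Not the project's actual code;
# the scale value is a hypothetical Q8 block scale.

def quantize_q8(values, scale):
    """Round each value to the nearest multiple of `scale` (simplified Q8)."""
    return [round(v / scale) * scale for v in values]

weights = [0.1234, -0.5678, 0.9012]     # example float weights
scale = 1 / 64                          # hypothetical quantization step
lora_delta = [0.003, -0.002, 0.004]     # small per-weight LoRA update

base_q = quantize_q8(weights, scale)
merged_q = quantize_q8([w + d for w, d in zip(base_q, lora_delta)], scale)

# Every delta here is smaller than half the quantization step (1/128),
# so rounding snaps each merged weight back to its original quantized
# value and the LoRA contributes nothing.
print(merged_q == base_q)  # True
```

This is one reason applying a LoRA at inference time (keeping the delta in higher precision) can behave differently from merging it into a quantized checkpoint.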