From "The main goal is to run the model using 4-bit quantization on a MacBook," I'm guessing this project is Apple-only and I have no business being in this repo. There's a mix of Windows and Apple commands and I can't tell which is which, e.g. `ls ./models`. Does the `ls` command even work on Windows?
Answered by x02Sylvie, Apr 5, 2023
-
The instructions may be confusing, but llama.cpp works on Windows, Linux, and macOS.
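On the `ls` side question: `ls` works in PowerShell on Windows, where it is a built-in alias for `Get-ChildItem`, but not in classic cmd.exe, where the equivalent is `dir`. A minimal sketch (the `mkdir` line is just demo setup so the directory exists):

```shell
mkdir -p models   # demo setup: ensure the directory exists (POSIX shells)

# Works in POSIX shells (Linux/macOS) and in PowerShell,
# where `ls` is an alias for Get-ChildItem:
ls ./models

# In classic cmd.exe the equivalent is:
#   dir models
```

So the same `ls ./models` line from the README can be run as-is on macOS, Linux, or a Windows PowerShell prompt.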
Answer selected by Njasa2k