forked from ollama/ollama
Has anyone got this working? It seems not to #22
Labels: bug (Something isn't working)
What is the issue?
I spent a lot of time trying to get this to work, and it seems it just doesn't; others are having issues too. I am wondering under what conditions it works.
It does not seem to build on macOS: after adding -fopenmp it shows errors about setting the number of threads. On Linux the build can't run models. Has anyone at all got it working, and under what conditions?
I am posting this in this issue for visibility, to save others from wasting time if it just doesn't work (to me the docs imply a level of functionality that isn't there).
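For anyone trying to reproduce, here is a minimal sketch of the build steps I would expect to apply, assuming this fork builds the same way as upstream ollama; the CGO_CFLAGS line is only my assumption about how -fopenmp gets added and is not something the fork documents.

```sh
# Minimal sketch, assuming this fork builds like upstream ollama
# (Go toolchain plus cmake and a C compiler must already be installed).

# Generate the embedded llama.cpp artifacts, then build the Go binary.
go generate ./...
go build .

# Assumption: adding -fopenmp via CGO_CFLAGS on macOS; this is the step
# that appears to trigger the errors about setting the number of threads.
CGO_CFLAGS="-fopenmp" go generate ./... && go build .
```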
OS: Linux, macOS
GPU: Nvidia, Apple
CPU: Intel, AMD, Apple
Ollama version: No response
Comments
Same here, not working at all. Gradio could not run; it prompts "connection errored out".