
[Question]: Ollama Get "http://localhost:11434/api/tags": dial tcp 127.0.0.1:11434: connect: connection refused #1195

Open
guarapodecoco opened this issue Dec 12, 2024 · 4 comments
Labels
question Further information is requested

Comments

@guarapodecoco

What is your question?

Hi mates, today I installed fabric and I'm trying to run it with Ollama as the default vendor, but I'm hitting this issue with the Ollama API. I configured it with its default port, and I can confirm Ollama is running on my local host. Looking around, I found a suggestion that I may need to create the API extension in my local Ollama environment? I tried localhost/api in the browser and got no results, so maybe this error comes from Ollama and not fabric? If anyone knows, please help. I've configured the YouTube API and the transcription tool is amazing!! Great job.
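For reference, the error in the title is just a failed HTTP GET against Ollama's /api/tags endpoint (the endpoint that lists installed models). A minimal standalone check in Go, independent of fabric, that reproduces exactly that call:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Same request fabric makes when it probes the Ollama vendor.
	resp, err := http.Get("http://localhost:11434/api/tags")
	if err != nil {
		// "connection refused" here means nothing is listening on that
		// address/port, so the problem is Ollama-side, not fabric-side.
		fmt.Println("Ollama is not reachable:", err)
		return
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Println(string(body)) // JSON list of locally installed models
}
```

If this also fails with connection refused, the server isn't listening on 127.0.0.1:11434 and fabric can't be the cause. (Note that in a browser you need the port too, i.e. http://localhost:11434/api/tags, not just localhost/api.)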

@guarapodecoco guarapodecoco added the question Further information is requested label Dec 12, 2024
@mrsatyp

mrsatyp commented Jan 3, 2025

Same here on Windows... but I think it works via WSL Linux on Windows; need to confirm.

@cipolino85

I've installed fabric with OpenAI, and I also installed Ollama via Windows, but I cannot configure Ollama as the default vendor. Any ideas how to make it work for yt extract?

@mrsatyp

mrsatyp commented Jan 8, 2025

I think it might be related to setting the Ollama endpoint as "listening" on whatever machine or Docker container it's running on. There is a command for that in one of the guides (sketched below).

Hopefully this resolves it, if I'm on the same page as what you meant.

🤞🏼
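Edit: the setting I had in mind is the OLLAMA_HOST environment variable, which controls the address the Ollama server binds to (by default it listens only on 127.0.0.1). Per the Ollama FAQ, something like this before starting the server:

```
# make Ollama listen on all interfaces instead of only 127.0.0.1
OLLAMA_HOST=0.0.0.0 ollama serve
```

On Windows, quit Ollama, set OLLAMA_HOST as a system environment variable, and restart the app instead.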

@mrsatyp

mrsatyp commented Jan 8, 2025

The rest on the fabric side will be the usual setup, since you would have entered your http://<host>:<port> info there (see the sketch below).

Hopefully we can add multiple Ollama endpoints at some point.
Ta
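For concreteness, that entry lives in fabric's env file. Something along these lines; the exact variable name may differ between fabric versions, so treat it as a sketch:

```
# ~/.config/fabric/.env
# point fabric at wherever the Ollama server is listening
OLLAMA_API_URL=http://localhost:11434
```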
