Hi mates, today I installed fabric and I'm trying to run it with Ollama as the default vendor, but I'm having this issue with the Ollama API. I've configured it with its default port, and I should say that Ollama is running on my local host; I've checked. Looking around, I found that I may need to create the API extension in my local Ollama environment? I tried localhost/api and got no results in the browser, so maybe this error comes from Ollama and not fabric? If anyone knows, please help. I've configured the YouTube API and the transcription tool is amazing!! Great job.
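For anyone hitting the same wall: Ollama's HTTP API listens on port 11434 by default, not on a bare `localhost/api` path, so browsing to `localhost/api` returning nothing is expected. A quick sanity check from a terminal (a sketch, assuming a default local install) would be:

```shell
# The root path answers "Ollama is running" when the server is up.
curl http://localhost:11434

# List the models you have pulled locally; an empty "models" array
# means the server is fine but no model has been pulled yet.
curl http://localhost:11434/api/tags
```

If both commands respond, the problem is more likely on the fabric configuration side (wrong base URL or port) than in Ollama itself.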
I've installed fabric with OpenAI, and I also installed Ollama on Windows, but I cannot configure Ollama as the default vendor. Any ideas how to make it work for yt extraction?
I think it might be related to setting the Ollama endpoint, on whatever machine or Docker container it's running on, to 'listening'; there is a command for that in one of the guides.
Hopefully this resolves it, if I'm on the same page as what you meant.
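If the command being referred to is the one that makes Ollama listen beyond the loopback interface, it's the `OLLAMA_HOST` environment variable. A minimal sketch, assuming a plain shell setup (on Windows you would set the variable through the system environment settings instead, then restart the Ollama service):

```shell
# Bind Ollama to all interfaces instead of only 127.0.0.1, so other
# machines or Docker containers can reach the API on port 11434.
export OLLAMA_HOST=0.0.0.0:11434
ollama serve
```

Note this is only needed when fabric runs somewhere other than the machine hosting Ollama; for a purely local setup the default `127.0.0.1:11434` binding should already work.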