
bug: in Ollama LLM the version is not shown in the dropdown #869

Open
itsKayumkhan opened this issue Dec 22, 2024 · 17 comments

Comments

@itsKayumkhan

Describe the bug


The 7b version of the Ollama model is not shown in the dropdown.

Link to the Bolt URL that caused the error

http://localhost:3000/chat/3

Steps to reproduce

go to home page
select the ollama model

Expected behavior

The model version should be shown in the dropdown.

Screen Recording / Screenshot

(screenshot attached)

Platform

  • OS: [e.g. macOS, Windows, Linux]
  • Browser: [e.g. Chrome, Safari, Firefox]
  • Version: [e.g. 91.1]

Provider Used

brave

Model Used

ollama

Additional context

No response

@Soumyaranjan-17

Are you running Ollama?
Check it at http://127.0.0.1:11434
or at http://127.0.0.1:11434/api/tags
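
For anyone following along, a quick way to confirm the Ollama API is reachable is to request that /api/tags endpoint, which lists locally installed models. A minimal TypeScript/Node sketch, assuming Node 18+ (built-in fetch) and Ollama on its default port:

// check-ollama.ts - minimal sketch: verify Ollama is reachable and list installed models.
const baseUrl = 'http://127.0.0.1:11434';

async function listOllamaModels() {
  const response = await fetch(`${baseUrl}/api/tags`);
  if (!response.ok) {
    throw new Error(`Ollama responded with HTTP ${response.status}`);
  }
  const data = (await response.json()) as { models: Array<{ name: string }> };
  console.log(data.models.map((m) => m.name)); // e.g. [ 'qwen2.5-coder:7b', ... ]
}

listOllamaModels().catch(console.error);

If this prints an empty list, no models are installed; if it throws, Ollama is not reachable at that address.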

@afhanxd

afhanxd commented Dec 23, 2024

I am facing the same issue; the model does not appear there. I followed all the steps, and Ollama is running (checked from http://127.0.0.1:11434/), but it's still the same issue.

@Soumyaranjan-17

Soumyaranjan-17 commented Dec 23, 2024

Can you show some screenshots?

  1. Console output from the browser
  2. Terminal output from your code editor
  3. .env (where you have set the localhost address)
  4. Provider settings in the Bolt UI

@afhanxd

afhanxd commented Dec 23, 2024

Here are some screenshots

Screenshot 2024-12-23 092001
Screenshot 2024-12-23 092011
Screenshot 2024-12-23 092103
Screenshot 2024-12-23 092126

@amanguleria-1

You will have to go to Providers in the Settings tab, scroll down to Ollama, and paste the Ollama base URL as http://localhost:11434 (the default port used to interact with Ollama). Refresh, and it shows up.
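
For reference, the same base URL can also go in the project's .env file; the variable name below (OLLAMA_API_BASE_URL) is the one mentioned later in this thread, and 127.0.0.1 is used rather than localhost for the reason discussed in the following comments. A sketch:

    # .env - assumed entry for a locally running Ollama instance on the default port
    OLLAMA_API_BASE_URL=http://127.0.0.1:11434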

@afhanxd

afhanxd commented Dec 23, 2024

Damn bro, it worked. Thank you for helping me out, but I'm having another problem now; it shows this:

Failed to get Ollama models: TypeError: fetch failed
at node:internal/deps/undici/undici:13484:13
at processTicksAndRejections (node:internal/process/task_queues:105:5)
at OllamaProvider.getDynamicModels (C:/Users/afkha/OneDrive/Desktop/OTTODEV/app/lib/modules/llm/providers/ollama.ts:41:24)
at async Promise.all (index 0)
at LLMManager.updateModelList (C:/Users/afkha/OneDrive/Desktop/OTTODEV/app/lib/modules/llm/manager.ts:71:27)
at Module.getModelList (C:/Users/afkha/OneDrive/Desktop/OTTODEV/app/utils/constants.ts:43:10)
at Module.streamText (C:/Users/afkha/OneDrive/Desktop/OTTODEV/app/lib/.server/llm/stream-text.ts:99:22)
at chatAction (C:/Users/afkha/OneDrive/Desktop/OTTODEV/app/routes/api.chat.ts:101:20)
at Object.callRouteAction (C:\Users\afkha\OneDrive\Desktop\OTTODEV\node_modules.pnpm@[email protected][email protected]\node_modules@remix-run\server-runtime\dist\data.js:36:16)
at C:\Users\afkha\OneDrive\Desktop\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:4899:19
at callLoaderOrAction (C:\Users\afkha\OneDrive\Desktop\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:4963:16)
at async Promise.all (index 0)
at defaultDataStrategy (C:\Users\afkha\OneDrive\Desktop\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:4772:17)
at callDataStrategyImpl (C:\Users\afkha\OneDrive\Desktop\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:4835:17)
at callDataStrategy (C:\Users\afkha\OneDrive\Desktop\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:3992:19)
at submit (C:\Users\afkha\OneDrive\Desktop\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:3755:21)
at queryImpl (C:\Users\afkha\OneDrive\Desktop\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:3684:22)
at Object.queryRoute (C:\Users\afkha\OneDrive\Desktop\OTTODEV\node_modules.pnpm@[email protected]\node_modules@remix-run\router\router.ts:3629:18)
at handleResourceRequest (C:\Users\afkha\OneDrive\Desktop\OTTODEV\node_modules.pnpm@[email protected][email protected]\node_modules@remix-run\server-runtime\dist\server.js:402:20)
at requestHandler (C:\Users\afkha\OneDrive\Desktop\OTTODEV\node_modules.pnpm@[email protected][email protected]\node_modules@remix-run\server-runtime\dist\server.js:156:18)
at C:\Users\afkha\OneDrive\Desktop\OTTODEV\node_modules.pnpm@remix-run+dev@2.15.0_@remix-run[email protected][email protected]_react@[email protected]_typ_3djlhh3t6jbfog2cydlrvgreoy\node_modules@remix-run\dev\dist\vite\cloudflare-proxy-plugin.js:70:25 {
[cause]: Error: connect ECONNREFUSED ::1:11434
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1615:16)
at TCPConnectWrap.callbackTrampoline (node:internal/async_hooks:130:17) {
errno: -4078,
code: 'ECONNREFUSED',
syscall: 'connect',
address: '::1',
port: 11434
}
}


@Laslink

Laslink commented Dec 23, 2024

Me too, with Ollama or an OpenAI API key:
errno: -4078,
code: 'ECONNREFUSED',
syscall: 'connect',
address: '::1',
port: 11434

@Soumyaranjan-17

Use http://127.0.0.1:11434 instead of localhost:11434,
in .env and in the Bolt UI provider settings as well.

@Soumyaranjan-17

Me too, with Ollama or an OpenAI API key: errno: -4078, code: 'ECONNREFUSED', syscall: 'connect', address: '::1', port: 11434

Don't use http://localhost:11434.
Use http://127.0.0.1:11434.

IPv6 resolution will convert localhost to ::1:11434, which causes this issue.
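
To make the point concrete, here is a minimal TypeScript sketch (the helper is hypothetical, not bolt.diy's actual code) that rewrites a localhost base URL to the IPv4 loopback before it is handed to fetch, so Node does not resolve it to ::1:

// Hypothetical helper, not the actual bolt.diy implementation.
// Node's fetch can resolve "localhost" to the IPv6 address ::1, where Ollama is often
// not listening, which produces the ECONNREFUSED ::1:11434 error shown above.
function toIpv4Loopback(baseUrl: string): string {
  return baseUrl.replace('//localhost:', '//127.0.0.1:');
}

const ollamaBaseUrl = toIpv4Loopback('http://localhost:11434');
console.log(ollamaBaseUrl); // http://127.0.0.1:11434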

@afhanxd

afhanxd commented Dec 24, 2024

@Soumyaranjan-17 bruhhh it worked 😭😭😭 thanks. Also, I am only able to use 0.5b-1.5b models; anything above that just makes my laptop lag.

Laptop

16GB LPDDR5X
NVIDIA® GeForce RTX™ 4060 Laptop GPU (233 AI TOPs)
8GB GDDR6
AMD Radeon™ Graphics
AMD Ryzen™ 9 7940HS Mobile Processor 4.0GHz (8-core/16-thread, 16MB cache, up to 5.2GHz max boost)

@Soumyaranjan-17

@afhanxd Congratulations bro....
But actually I don't have much knowledge about system hardware,
because I only have 6GB of RAM and no dedicated graphics card 😐

But I think qwen 7b should work on your computer, as it works on mine too 😅

If you are facing any issues, show some screenshots of the errors and your config.

@Mayur1804

Can someone help me with this? Why am I not able to access it?

(screenshots attached)

@Soumyaranjan-17

@Mayur1804 Have you installed the model that you are using?

@oliverbarreto

oliverbarreto commented Jan 1, 2025

Same here... can you provide help?!

Current setup with a Windows 11 host:

  • Native Ollama instance working perfectly fine with Open WebUI and Python scripts
  • Bolt.diy using Docker... and working PERFECTLY using a Google API key and the Gemini 2 model... but not with local Ollama models
  • Environment variable for Ollama: OLLAMA_HOST=0.0.0.0:11434
  • If I configure the Ollama provider in Bolt.diy as "http://127.0.0.1:11434" or "http://host_IP:11434", I cannot see the available models from my working Ollama server.
  • If I configure the Ollama provider in Bolt.diy as "https://ollama.mydomain.duckdns.org/" and configure a reverse proxy host for the Ollama server with a DNS rule pointing to "host_IP:11434"... I can see the dropdown of available models, but when I hit send, I get the same error as in the previous comments.

Captura de pantalla 2025-01-01 182504
Captura de pantalla 2025-01-01 181307
Captura de pantalla 2025-01-01 181322

LOGS:
2025-01-01 18:48:00 app-dev-1 | INFO LLMManager Found 10 cached models for Ollama
2025-01-01 18:48:00 app-dev-1 | INFO stream-text Sending llm call to Ollama with model llama3.2:latest
2025-01-01 18:48:00 app-dev-1 | ERROR api.chat TypeError: Cannot read properties of undefined (reading 'replace')
2025-01-01 18:48:00 app-dev-1 | at OllamaProvider.getModelInstance (/app/app/lib/modules/llm/providers/ollama.ts:59:34)
2025-01-01 18:48:00 app-dev-1 | at Module.streamText (/app/app/lib/.server/llm/stream-text.ts:156:21)
2025-01-01 18:48:00 app-dev-1 | at processTicksAndRejections (node:internal/process/task_queues:95:5)
2025-01-01 18:48:00 app-dev-1 | at chatAction (/app/app/routes/api.chat.ts:116:20)
2025-01-01 18:48:00 app-dev-1 | at Object.callRouteAction (/app/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/data.js:36:16)
2025-01-01 18:48:00 app-dev-1 | at /app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4899:19
2025-01-01 18:48:00 app-dev-1 | at callLoaderOrAction (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4963:16)
2025-01-01 18:48:00 app-dev-1 | at async Promise.all (index 0)
2025-01-01 18:48:00 app-dev-1 | at defaultDataStrategy (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4772:17)
2025-01-01 18:48:00 app-dev-1 | at callDataStrategyImpl (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4835:17)
2025-01-01 18:48:00 app-dev-1 | at callDataStrategy (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3992:19)
2025-01-01 18:48:00 app-dev-1 | at submit (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3755:21)
2025-01-01 18:48:00 app-dev-1 | at queryImpl (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3684:22)
2025-01-01 18:48:00 app-dev-1 | at Object.queryRoute (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3629:18)
2025-01-01 18:48:00 app-dev-1 | at handleResourceRequest (/app/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:402:20)
2025-01-01 18:48:00 app-dev-1 | at requestHandler (/app/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:156:18)
2025-01-01 18:48:00 app-dev-1 | at /app/node_modules/.pnpm/@remix-run+dev@2.15.0_@remix-run[email protected][email protected]_react@[email protected]_typ_3djlhh3t6jbfog2cydlrvgreoy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25

@Mayur1804

Mayur1804 commented Jan 2, 2025

@Soumyaranjan-17

Yes. You can see in one of my screenshots that it says Ollama is running.

@nwslnk

nwslnk commented Jan 3, 2025

YMMV, but I hope this might help someone landing here:

Worth noting that if you're running Ollama or Bolt via HTTPS, both services should be using the same scheme, or the request to list models may fail in your browser (check the console for mixed http/https errors).

ollama & bolt via traefik w/o origins

If you're running Bolt with SSL and Ollama in another container, you need to enable SSL for Ollama as well, or you'll see CORS issues in the browser console. You also need to pass something like -e OLLAMA_ORIGINS="*" (not in prod, obviously) (https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-allow-additional-web-origins-to-access-ollama).

ollama via http, bolt via https

With Bolt and Ollama running over SSL via Traefik and the origins env flag mentioned above, the model list loads correctly.

with origins & both via ssl

Also, FWIW... if you're using Podman, you may need to update the compose file from "docker" to "containers" (as below), and it may need to be run as root rather than rootless:

    extra_hosts:
      - "host.containers.internal:host-gateway"

@zha0jf

zha0jf commented Jan 10, 2025

@oliverbarreto
I encountered this issue as well. I looked at the code in app/lib/modules/llm/providers/ollama.ts and found the following lines:

// Backend: Check if we're running in Docker
const isDocker = process.env.RUNNING_IN_DOCKER === 'true';

baseUrl = isDocker ? baseUrl.replace('localhost', 'host.docker.internal') : baseUrl;
baseUrl = isDocker ? baseUrl.replace('127.0.0.1', 'host.docker.internal') : baseUrl;

If Bolt.diy is running in Docker and using Ollama, the OLLAMA_API_BASE_URL environment variable must be set. Otherwise, when the code reaches this point, it will throw an error and exit because the baseUrl is empty.
After configuring this environment variable in docker-compose.yaml, the issue was resolved.
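
For anyone hitting the same thing, a minimal docker-compose.yaml sketch of that environment block might look like the following (the service name app-dev is assumed from the log prefix above; host.docker.internal matches the substitution in the quoted provider code):

    # Sketch only - adapt to your own compose file. RUNNING_IN_DOCKER triggers the
    # localhost -> host.docker.internal substitution quoted above; OLLAMA_API_BASE_URL
    # must be non-empty or the provider fails before it can list models.
    services:
      app-dev:
        environment:
          - RUNNING_IN_DOCKER=true
          - OLLAMA_API_BASE_URL=http://host.docker.internal:11434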
