I can't get AnythingLLM to work on my Ubuntu 24.04 machine running on an Intel Xeon, connected to an Ollama instance on the same machine.
I get the following error in the logs:
/usr/local/bin/docker-entrypoint.sh: line 7: 123 Illegal instruction (core dumped) node /app/server/index.js
The full log dump:
Collector hot directory and tmp storage wiped!
Document processor app listening on port 8888
Environment variables loaded from .env
Prisma schema loaded from prisma/schema.prisma
✔ Generated Prisma Client (v5.3.1) to ./node_modules/@prisma/client in 496ms
Environment variables loaded from .env
Prisma schema loaded from prisma/schema.prisma
Datasource "db": SQLite database "anythingllm.db" at "file:../storage/anythingllm.db"
20 migrations found in prisma/migrations
No pending migrations to apply.
┌─────────────────────────────────────────────────────────┐
│ Update available 5.3.1 -> 5.15.0 │
│ Run the following to update │
│ npm i --save-dev prisma@latest │
│ npm i @prisma/client@latest │
└─────────────────────────────────────────────────────────┘
[TELEMETRY ENABLED] Anonymous Telemetry enabled. Telemetry helps Mintplex Labs Inc improve AnythingLLM.
prisma:info Starting a sqlite pool with 13 connections.
fatal: not a git repository (or any of the parent directories): .git
getGitVersion Command failed: git rev-parse HEAD
fatal: not a git repository (or any of the parent directories): .git
[TELEMETRY SENT] {
event: 'server_boot',
distinctId: '9c10440f-e20f-43c9-b27f-062c6f731e19',
properties: { commit: '--', runtime: 'docker' }
}
[CommunicationKey] RSA key pair generated for signed payloads within AnythingLLM services.
Primary server in HTTP mode listening on port 3001
OpenAIError: The OPENAI_API_KEY environment variable is missing or empty; either provide it, or instantiate the OpenAI client with an apiKey option, like new OpenAI({ apiKey: 'My API Key' }).
at new OpenAI (/app/server/node_modules/openai/index.js:53:19)
at openAiModels (/app/server/utils/helpers/customModels.js:59:18)
at getCustomModels (/app/server/utils/helpers/customModels.js:29:20)
at /app/server/endpoints/system.js:929:41
at Layer.handle [as handle_request] (/app/server/node_modules/express/lib/router/layer.js:95:5)
at next (/app/server/node_modules/express/lib/router/route.js:149:13)
at validatedRequest (/app/server/utils/middleware/validatedRequest.js:18:5)
[Event Logged] - update_llm_provider
[Event Logged] - update_embedding_engine
[Event Logged] - update_vector_db
[TELEMETRY SENT] {
event: 'workspace_created',
distinctId: '9c10440f-e20f-43c9-b27f-062c6f731e19',
properties: {
multiUserMode: false,
LLMSelection: 'ollama',
Embedder: 'native',
VectorDbSelection: 'lancedb',
runtime: 'docker'
}
}
[Event Logged] - workspace_created
[TELEMETRY SENT] {
event: 'onboarding_complete',
distinctId: '9c10440f-e20f-43c9-b27f-062c6f731e19',
properties: { runtime: 'docker' }
}
[NativeEmbedder] Initialized
/usr/local/bin/docker-entrypoint.sh: line 7: 123 Illegal instruction (core dumped) node /app/server/index.js
I searched the closed issues and landed on #1265, which I guess describes a similar situation, but I'm new to Linux and can't wrap my head around the solution.
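For context on the crash line: "Illegal instruction (core dumped)" typically means the process executed a CPU instruction the processor does not implement, often an AVX/AVX2 instruction baked into a prebuilt native module. It is only an assumption (not confirmed by this log) that the Xeon here lacks those extensions, but a quick Linux-only check of what the kernel reports would confirm or rule it out:

```shell
# "Illegal instruction (core dumped)" usually means the binary used a CPU
# instruction this processor does not implement -- commonly AVX/AVX2 in
# prebuilt native modules. List which of those extensions this CPU reports.
flags=$(grep -m1 '^flags' /proc/cpuinfo)
for ext in avx avx2 avx512f; do
  # Surround with spaces so "avx" does not also match inside "avx2".
  case " ${flags#*: } " in
    *" $ext "*) echo "$ext: supported" ;;
    *)          echo "$ext: not supported" ;;
  esac
done
```

If avx and avx2 come back as not supported, that would line up with the crash happening immediately after "[NativeEmbedder] Initialized" in the log above.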
Are there known steps to reproduce?
No response
How are you running AnythingLLM?
Docker (local)