Elasticsearch Version
8.15.3
Installed Plugins
No response
Java Version
bundled
OS Version
docker elasticsearch:8.15.3
Problem Description
I deployed the .elser_model_2_linux-x86_64 model (Elastic Learned Sparse EncodeR v2 optimized for linux-x86_64, version 12.0.0) on an ML node to use the Knowledge Base. During the Knowledge Base setup, the task crashed because inference against the model failed: before inference the model is reported as "started", but it becomes "failed" after the first inference request.
My version is 8.15.3 (from Docker on Linux) and my CPU is an Intel(R) Xeon(R) CPU E7540.
The problem seems to be linked to the PyTorch library, but I have no idea how to fix it.
Related issue: #106206
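For reference, a request of roughly the following shape is what triggers the failure. This is only a minimal sketch using the Python elasticsearch client; the host, credentials, sample text, and the exact fields read from the stats response are placeholders/assumptions, not my exact setup.

```python
# Minimal sketch: check the ELSER deployment state, then send a single
# inference request. On the affected host the deployment goes from
# "started" to "failed" after the inference call.
from elasticsearch import Elasticsearch

# Placeholder connection details
es = Elasticsearch("https://localhost:9200", basic_auth=("elastic", "<password>"))

MODEL_ID = ".elser_model_2_linux-x86_64"

# Deployment stats: the model reports "started" here before any inference runs.
stats = es.ml.get_trained_models_stats(model_id=MODEL_ID)
print(stats["trained_model_stats"][0].get("deployment_stats", {}).get("state"))

# A single inference request against the deployed model.
response = es.ml.infer_trained_model(
    model_id=MODEL_ID,
    docs=[{"text_field": "test sentence for the knowledge base"}],
)
print(response)
```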
Steps to Reproduce
Logs (if relevant)