System Info
3090 GPU (Ampere model)
Information
Tasks
Reproduction
I was following this tutorial and realised it was missing some important pointers, like how to use NVIDIA cards to run the TEI GPU image locally. I was able to install the NVIDIA Container Toolkit after reading the GitHub README, yet Docker was unable to detect the CUDA devices and fell back to running on the CPU.

Note: I can use CUDA from my Jupyter Notebook, and I have two 3090s on my remote node.
https://huggingface.co/docs/text-embeddings-inference/en/local_gpu
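For context, this is roughly the setup I would expect to work, assuming the NVIDIA Container Toolkit is installed as described in its README (the TEI image tag and model id below are illustrative, not the exact ones I used):

```shell
# Register the NVIDIA runtime with Docker (required after installing the
# NVIDIA Container Toolkit), then restart the Docker daemon.
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Sanity-check that containers can see the GPUs before involving TEI.
# This should print the same nvidia-smi table as on the host.
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi

# Run TEI with GPU access. An RTX 3090 is compute capability 8.6, and the
# TEI README lists Ampere-specific image tags (e.g. 86-<version>); check
# the README for the current tag before copying this.
docker run --rm --gpus all -p 8080:80 \
  ghcr.io/huggingface/text-embeddings-inference:86-1.5 \
  --model-id BAAI/bge-large-en-v1.5
```

In my case the `nvidia-smi` sanity check inside a container is what fails, which is why TEI ends up on the CPU.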
Expected behavior
Docker should detect my GPUs so that TEI runs on them instead of the CPU.