
Incomplete tutorial #395

Open
2 of 4 tasks
sleepingcat4 opened this issue Sep 3, 2024 · 0 comments

Comments

@sleepingcat4

System Info

3090 GPU (Ampere model)

Information

  • Docker
  • The CLI directly

Tasks

  • An officially supported command
  • My own modifications

Reproduction

I was following this tutorial and realised it is missing some important pointers, such as how to use NVIDIA cards to run TEI on GPU locally. I was able to install the NVIDIA Container Toolkit after reading its GitHub README. Even so, my Docker was unable to detect the CUDA devices and was running on CPU.

Note: I can use CUDA from my Jupyter Notebook, and I have two 3090s on my remote node.

https://huggingface.co/docs/text-embeddings-inference/en/local_gpu
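For context, here is a minimal sketch of the kind of pointers the tutorial could add, assuming the NVIDIA Container Toolkit is already installed. The image tag and model ID below are illustrative (the TEI README lists different tags per GPU architecture), so check them against your setup:

```shell
# First verify that Docker can see the GPUs at all; this should print
# the same table as running nvidia-smi directly on the host.
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi

# If that fails, the NVIDIA runtime is often not registered with Docker:
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Then run TEI with GPU access; --gpus all is what exposes the CUDA
# devices to the container. Tag and model ID here are examples only.
docker run --gpus all -p 8080:80 \
  ghcr.io/huggingface/text-embeddings-inference:1.5 \
  --model-id BAAI/bge-large-en-v1.5
```

Without `--gpus all` (or with the NVIDIA runtime unconfigured), the container silently falls back to CPU, which matches the behavior described above.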

Expected behavior

Docker should be able to detect my GPUs and run TEI on them instead of the CPU.
