How to use multiple GPUs to finetune the model? #7
Hi, if I follow the instructions to run image_train_latent.py, it seems that only one GPU is used. Can you advise on how to use multiple GPUs? Thanks.

Comments

You can use the …

Got it. Thank you.

@cchangyou Did … Thanks.

@alishan2040 The load is not shared among GPUs; you'll need multiple GPUs with enough VRAM each.

@limiteinductive How much VRAM should be considered enough for a single GPU? I now have 4 GPUs with 16 GB of VRAM each; previously I had a single GPU with 24 GB of VRAM. In both cases, I ran into memory errors.

@alishan2040 I tried only using A100s.
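Since the first reply is cut off, here is a minimal sketch of one common way to run a PyTorch training script on several GPUs with DistributedDataParallel. This is an assumption, not the repository's documented method: the actual internals of image_train_latent.py are not shown in this thread, so build_model() and build_dataset() below are hypothetical placeholders.

```python
# Hypothetical multi-GPU sketch: NOT the repository's code.
# Assumes a standard PyTorch training loop; build_model()/build_dataset()
# are placeholders for whatever image_train_latent.py actually constructs.
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset


def build_model() -> torch.nn.Module:
    # Placeholder model; substitute the real model here.
    return torch.nn.Linear(128, 128)


def build_dataset() -> TensorDataset:
    # Placeholder dataset; substitute the real latent dataset here.
    return TensorDataset(torch.randn(1024, 128))


def main() -> None:
    # torchrun sets LOCAL_RANK for each spawned process (one per GPU).
    local_rank = int(os.environ["LOCAL_RANK"])
    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(local_rank)

    model = build_model().cuda(local_rank)
    # DDP replicates the full model on every GPU and averages gradients,
    # so each GPU still needs enough VRAM to hold a complete replica.
    model = DDP(model, device_ids=[local_rank])

    dataset = build_dataset()
    sampler = DistributedSampler(dataset)  # shards the data across ranks
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle differently each epoch
        for (batch,) in loader:
            batch = batch.cuda(local_rank, non_blocking=True)
            loss = model(batch).pow(2).mean()  # dummy loss for the sketch
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    # Launch with, e.g.:  torchrun --nproc_per_node=4 this_script.py
    main()
```

Note that this approach matches the point made above: DistributedDataParallel splits the data, not the model, so every GPU must have enough VRAM for a full copy of the model plus its optimizer state. If a single 16 GB or 24 GB card already runs out of memory, adding more GPUs of the same size will not help with DDP alone; model-sharding approaches (e.g. FSDP or DeepSpeed) would be needed instead, which are beyond this sketch.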