Hi,
I'm running into some problems when I try to use the model I trained.
The first issue: I use the vit-h config, and my GPU is an RTX A6000 (48 GB). Training with the vit_h pretrained weights finishes without errors, but when I run test.py with the checkpoint I trained, CUDA runs out of memory. I already tried batch_size=1 and the error still happens.
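For reference, this is roughly the inference pattern I assume test.py should follow (a minimal sketch, not the repo's actual code; build_model, config, and test_loader are placeholders). My understanding is that model.eval() plus torch.no_grad() keeps activation memory from piling up during testing:

```python
import torch

device = torch.device("cuda:0")

# Placeholders: build_model / config / test_loader stand in for whatever the
# repo actually uses, they are not its real API.
model = build_model(config)
state = torch.load("checkpoint.pth", map_location="cpu")  # load weights on CPU first
model.load_state_dict(state)
model.to(device)
model.eval()                    # disable dropout / BN updates for inference

with torch.no_grad():           # do not keep activations for backward -> much less memory
    for batch in test_loader:   # hypothetical DataLoader with batch_size=1
        images = batch["image"].to(device)
        outputs = model(images)
```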
The second issue: when I try to train or test with 2 GPUs using the .pth file I saved, all local_rank processes end up on the first GPU, and this also makes CUDA run out of memory.
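I suspect this happens because the checkpoint is loaded onto cuda:0 by every process. Below is a sketch of the per-rank setup I think is needed (assuming the launcher, e.g. torchrun, sets the LOCal_RANK environment variable; build_model and config are again placeholders), where each process pins its own device and maps the checkpoint onto it:

```python
import os
import torch
import torch.distributed as dist

# Assumption: launched with torchrun, which sets LOCAL_RANK for each process.
local_rank = int(os.environ["LOCAL_RANK"])
torch.cuda.set_device(local_rank)            # pin this process to its own GPU
dist.init_process_group(backend="nccl")

# Map the checkpoint onto this rank's GPU instead of the GPU it was saved from;
# without map_location, every rank may load the weights onto cuda:0.
state = torch.load("checkpoint.pth", map_location=f"cuda:{local_rank}")

model = build_model(config)                  # hypothetical model constructor
model.load_state_dict(state)
model = torch.nn.parallel.DistributedDataParallel(
    model.cuda(local_rank), device_ids=[local_rank]
)
```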