About the compatibility of Colossal and torch.distributed. #3460
Unanswered
ZhuXMMM asked this question in Community | Q&A
Replies: 1 comment
We maintain similar information for all distributed groups in a global context. You shall use |
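A minimal sketch of what this implies, in plain PyTorch (no ColossalAI required, since the reply above is truncated): a framework that keeps its distributed groups in a global context typically hands them out as `torch.distributed.ProcessGroup` objects, and every `torch.distributed` collective accepts such a group via its `group` argument. The group creation below is a hypothetical stand-in for whatever the framework's context would return.

```python
import os
import torch
import torch.distributed as dist

# Single-process gloo setup so the sketch runs on CPU with world_size=1.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group(backend="gloo", rank=0, world_size=1)

# Stand-in for a ProcessGroup fetched from a framework's global context.
group = dist.new_group(ranks=[0])

t = torch.tensor([3.0])
# The stock collective runs unchanged on that group.
dist.all_reduce(t, op=dist.ReduceOp.SUM, group=group)

dist.destroy_process_group()
```

The point is that existing `torch.distributed` call sites usually do not need to be rewritten; they only need to be pointed at the right group.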
In my code, the original distributed logic uses functions such as torch.distributed.all_reduce(), torch.distributed.get_backend(), torch.distributed.all_gather(), and torch.distributed.is_available(). Are these functions compatible with Colossal? If I need to replace them, which Colossal functions should I use, or how can I find out?
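For reference, the four functions named in the question can be exercised in a single-process gloo group (CPU-only, world_size=1). This is a hedged sketch of their plain-PyTorch behavior before any framework is layered on top; nothing here is ColossalAI-specific.

```python
import os
import torch
import torch.distributed as dist

# The package must have been built with distributed support.
assert dist.is_available()

# Single-process gloo group so the demo runs on one CPU machine.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29501")
dist.init_process_group(backend="gloo", rank=0, world_size=1)

backend = dist.get_backend()  # the backend of the default group

t = torch.tensor([1.0, 2.0])
# Sum across all ranks; with a single rank the tensor is unchanged.
dist.all_reduce(t, op=dist.ReduceOp.SUM)

# Collect every rank's tensor into a list, one slot per rank.
gathered = [torch.zeros_like(t) for _ in range(dist.get_world_size())]
dist.all_gather(gathered, t)

dist.destroy_process_group()
```

Each of these accepts an optional `group` argument, so if a framework maintains its own process groups, the same calls can be kept and redirected at those groups.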