Error when running the 2D tensor parallel example #814
Answered by kurisusnowdeng
chinoll asked this question in Community | Q&A
-
An error occurred when running the 2D tensor parallel example.
Command line: `python -m torch.distributed.launch --nproc_per_node 4 --master_addr localhost --master_port 29500 test.py`. The configuration is the same as in the example. How can I resolve this?
Answered by
kurisusnowdeng
Apr 20, 2022
Replies: 2 comments 10 replies
-
Hi, which version of ColossalAI are you using?
1 reply
-
@chinoll Hi, was the 2D parallel context initialized correctly? You can check with the following code:

```python
from colossalai.core import global_context as gpc
from colossalai.context import ParallelMode

# Both the row and column process groups must exist for 2D tensor parallelism
assert gpc.is_initialized(ParallelMode.PARALLEL_2D_COL)
assert gpc.is_initialized(ParallelMode.PARALLEL_2D_ROW)
```
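For context, ColossalAI derives the 2D process groups from the parallel settings in the launch config. The sketch below shows the kind of config this implies (the file name `config.py` is an assumption, not taken from the thread): in `'2d'` mode the tensor parallel size must be a perfect square, and `size=4` (a 2 × 2 grid) matches the `--nproc_per_node 4` used in the question.

```python
# Hypothetical config.py for a 4-GPU 2D tensor parallel run.
# In '2d' mode the tensor parallel size must be a perfect square:
# 4 ranks form a 2 x 2 grid of row/column process groups.
parallel = dict(
    data=1,        # no data parallelism across the 4 ranks
    pipeline=1,    # no pipeline parallelism
    tensor=dict(size=4, mode='2d'),
)
```

If `tensor.size` is not a perfect square (or does not match the number of launched processes), the 2D row/column groups will not be created and the assertions above will fail.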
9 replies
Answer selected by
chinoll