
[QA] I have already trained a 7B model with InternEvo; how can I use these InternEvo weights to run MoE? #251

Open
Cerberous opened this issue Jun 17, 2024 · 1 comment
Comments

@Cerberous

Describe the question.

I trained a 7B model with InternEvo and obtained an InternEvo model checkpoint. Now I want to run a MoE model based on these weights, but loading them fails with the error below. How can I resolve this?
AssertionError: /beegfs/workspace/nlp/leo/model_ckpt/7B_v7/715255/model_moe_layer0_expert0_tp0.pt is not found!

@Cerberous Cerberous added the question Further information is requested label Jun 17, 2024
@gaoyang07 gaoyang07 assigned sunpengsdu and unassigned yhcc Jun 18, 2024
@blankde
Collaborator

blankde commented Jun 28, 2024

@Cerberous This requires a conversion script from the 7B dense model to a MoE model; it will be open-sourced to the repository next week.
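
Until that script lands, the assertion itself shows what the MoE loader expects: one checkpoint file per layer, expert, and tensor-parallel rank, named model_moe_layer{L}_expert{E}_tp{T}.pt. Below is a minimal sketch of an "upcycling" conversion that copies each layer's dense FFN weights into one file per expert. The dense file layout (model_tp{T}_pp0.pt), the key prefix model.layers.{i}.feed_forward., and the layer/expert counts are all assumptions, not the official procedure; inspect your own checkpoint's keys and adjust before relying on it.

```python
import os
import torch

SRC = "/path/to/dense_ckpt"   # folder holding the dense files (assumed model_tp{T}_pp0.pt layout)
DST = "/path/to/moe_ckpt"     # folder the MoE run will load expert files from
NUM_LAYERS = 32               # assumption for a 7B config; match your model
NUM_EXPERTS = 4               # match your MoE config
TP = 1                        # tensor-parallel size used when saving the dense run

os.makedirs(DST, exist_ok=True)
for tp in range(TP):
    dense = torch.load(os.path.join(SRC, f"model_tp{tp}_pp0.pt"), map_location="cpu")
    for layer in range(NUM_LAYERS):
        # Gather this layer's FFN tensors; the "feed_forward." prefix is an
        # InternLM-style assumption -- print dense.keys() to confirm.
        prefix = f"model.layers.{layer}.feed_forward."
        ffn = {k[len(prefix):]: v for k, v in dense.items() if k.startswith(prefix)}
        assert ffn, f"no FFN weights under {prefix}; adjust the key prefix"
        for expert in range(NUM_EXPERTS):
            # Upcycling: every expert starts as a copy of the dense FFN, saved
            # under the exact file name the failing assertion was looking for.
            out = os.path.join(DST, f"model_moe_layer{layer}_expert{expert}_tp{tp}.pt")
            torch.save({k: v.clone() for k, v in ffn.items()}, out)
```

Note that this only produces the per-expert files whose absence triggers the assertion; the router/gate weights and the remaining dense parameters still need to be initialized the way the official conversion script will do once released.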
