
[Unified Checkpoint] Support expert parallel #9055

Merged
merged 4 commits into PaddlePaddle:develop on Oct 14, 2024

Conversation

DesmonDay
Contributor

PR types

New features

PR changes

Others

Description

Support expert parallel for unified checkpoint.


paddle-bot bot commented Aug 30, 2024

Thanks for your contribution!


codecov bot commented Aug 30, 2024

Codecov Report

Attention: Patch coverage is 4.25532% with 135 lines in your changes missing coverage. Please review.

Project coverage is 53.24%. Comparing base (db270d9) to head (846bb24).
Report is 52 commits behind head on develop.

Files with missing lines Patch % Lines
paddlenlp/trainer/plugins/unified_checkpoint.py 4.25% 135 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #9055      +/-   ##
===========================================
- Coverage    53.29%   53.24%   -0.06%     
===========================================
  Files          652      652              
  Lines       105483   105599     +116     
===========================================
+ Hits         56222    56225       +3     
- Misses       49261    49374     +113     


@ZHUI
Collaborator

ZHUI commented Sep 13, 2024

ready for review?

@DesmonDay
Contributor Author

DesmonDay commented Sep 13, 2024

> ready for review?

yes, come!!!

try:
    from paddle.base import core
except:
    core = None
Collaborator

Is this for the inference build of Paddle?

expected_keys = set()
for key in model_state_dict.keys():
    if getattr(model_state_dict[key], "no_sync", False):
        expected_keys.add(key)
Collaborator

Are the QKV parameters that are not marked no_sync loaded via an extra broadcast? And how is this handled for the model weights, the optimizer states, and the master weights respectively?
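The snippet under discussion collects only the parameters tagged as expert-local. A minimal sketch of that filtering behavior, assuming a `no_sync` attribute marks expert-parallel parameters (the `FakeParam` class and the parameter names below are illustrative stand-ins, not PaddleNLP's actual objects):

```python
# Illustrative stand-in for a parameter object; in PaddleNLP this would be
# a paddle Tensor carrying a `no_sync` attribute on expert-parallel weights.
class FakeParam:
    def __init__(self, no_sync=False):
        self.no_sync = no_sync

model_state_dict = {
    "experts.0.w1": FakeParam(no_sync=True),  # expert-local weight, held only on this rank
    "attn.qkv.weight": FakeParam(),           # regular parameter, synced across ranks
}

expected_keys = set()
for key in model_state_dict.keys():
    # Only parameters marked no_sync are expected from this expert-parallel
    # rank; everything else is covered by the usual sharded save/load path.
    if getattr(model_state_dict[key], "no_sync", False):
        expected_keys.add(key)

print(expected_keys)  # -> {'experts.0.w1'}
```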

@@ -1982,16 +2083,58 @@ def gather_sharded_object(index_file, total_size, is_optimizer=False):
return index_file_list, total_size_list


def rename_shard_file(args, shard_file, file_name):
    """rename shard file when using expert_parallel."""
Collaborator

Could you add a comment explaining this? Does it still follow the original naming scheme, or is it modified here? And why not write a separate new function for the MoE case?
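For illustration only, a hypothetical sketch of what an expert-parallel-aware shard rename could look like. The naming scheme, the `moe_rank` parameter, and the function itself are assumptions made for this example, not PaddleNLP's actual rule:

```python
def rename_shard_file_sketch(shard_file, moe_rank):
    # Hypothetical scheme (an assumption, not the real implementation):
    # tag the shard with the expert-parallel rank so shards written by
    # different MoE ranks do not collide, e.g.
    # "model-00001-of-00008.safetensors" -> "model-moe02-00001-of-00008.safetensors"
    prefix, rest = shard_file.split("-", 1)
    return f"{prefix}-moe{moe_rank:02d}-{rest}"

print(rename_shard_file_sketch("model-00001-of-00008.safetensors", 2))
# -> model-moe02-00001-of-00008.safetensors
```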

def generate_base_static_name(vname):
    # return base static name and specific type name, like [embedding_0.w_0, moment1_0]
    if FP32_MASTER in vname:
        vname = vname.split("_" + FP32_MASTER + "_")
        return vname[0], vname[1]
    else:
        vname = vname.split(".")
Collaborator

I can't quite follow the original code here anymore. Is the new version equivalent?

Collaborator

embedding_0.w_0.moment1_0

Collaborator

moe_gate_1_moment1_0
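The two example names the reviewers traded can be checked with a small sketch of the split being discussed. This assumes the dot-form splits off the last component and that `FP32_MASTER` has the value shown; both are assumptions, since the quoted excerpt is truncated:

```python
FP32_MASTER = "fp32_master_0"  # assumed value; in PaddleNLP this is a module constant

def split_static_name_sketch(vname):
    # Split an optimizer variable name into (base parameter name, optimizer state name).
    if FP32_MASTER in vname:
        # master-weight form: "<base>_fp32_master_0_<state>"
        left, right = vname.split("_" + FP32_MASTER + "_")
        return left, right
    # plain form: assume the optimizer state is the last dot-separated component
    parts = vname.split(".")
    return ".".join(parts[:-1]), parts[-1]

# Dot-separated names split as intended:
print(split_static_name_sketch("embedding_0.w_0.moment1_0"))
# -> ('embedding_0.w_0', 'moment1_0')

# A dot-free MoE-style name yields an empty base name, which is the
# corner case the reviewer's example appears to point at:
print(split_static_name_sketch("moe_gate_1_moment1_0"))
# -> ('', 'moe_gate_1_moment1_0')
```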

Collaborator

@ZHUI ZHUI left a comment

LGTM

@ZHUI ZHUI merged commit d46bc06 into PaddlePaddle:develop Oct 14, 2024
7 of 12 checks passed
DesmonDay added a commit to DesmonDay/PaddleNLP that referenced this pull request Oct 16, 2024
DesmonDay added a commit that referenced this pull request Oct 16, 2024
* [Unified Checkpoint] Support expert parallel (#9055)

* update code

* [Unified Checkpoint] Fix generation config save (#9223)

* [Unified Checkpoint] update async_save_info in develop (#9173)

* [Unified Checkpoint] update async save logic (#9274)

* update async save signal

* fix async save hang

* bug fix

---------

Co-authored-by: Weiguo Zhu <[email protected]>
DrownFish19 pushed a commit to DrownFish19/PaddleNLP that referenced this pull request Oct 16, 2024
wawltor pushed a commit that referenced this pull request Oct 17, 2024
* [Unified Checkpoint] Support expert parallel (#9055)

* update code

* [Unified Checkpoint] Fix generation config save (#9223)

* [Unified Checkpoint] update async_save_info in develop (#9173)

* [Unified Checkpoint] update async save logic (#9274)

* update async save signal

* fix async save hang

* bug fix

* bug fix

* [Trainer] fix save_model (#9286)

* bug fix

* bug fix

---------

Co-authored-by: Weiguo Zhu <[email protected]>