Feature: Basic distilling. #6527

Open
marko1616 wants to merge 4 commits into base: main

Conversation

marko1616 (Contributor) commented:

What does this PR do?

Fixes # (issue)

Before submitting

@hiyouga self-requested a review on January 8, 2025, 05:01.
Review comment on src/llamafactory/train/distilling/trainer.py:

else:
    # Fall back to the legacy `tokenizer` kwarg when no processing class
    # is passed explicitly (transformers renamed the argument).
    self.processing_class: "PreTrainedTokenizer" = kwargs.get("tokenizer")

self.teacher_model = teacher_model
hiyouga (Owner): Does it work in a DDP setting?

marko1616 (Contributor, Author): I'm working on DeepSpeed support; default DDP already works.
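For reference, the usual DDP subtlety in distillation is that only the student gets wrapped by DistributedDataParallel; the frozen teacher just needs a copy on each rank. A minimal sketch of that pattern, with illustrative names rather than the PR's actual code:

import torch

def prepare_teacher(teacher_model: torch.nn.Module, device: torch.device) -> torch.nn.Module:
    # The teacher is inference-only: move it to this rank's device,
    # freeze its parameters, and keep it out of the DDP wrapper and
    # the optimizer so no gradients are computed or synchronized for it.
    teacher_model.to(device)
    teacher_model.eval()
    for param in teacher_model.parameters():
        param.requires_grad_(False)
    return teacher_model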

marko1616 (Contributor, Author), Jan 11, 2025: In any case, I won't be using trl's GKDTrainer (trl>=11.0), because it can't be used for multimodal LLMs.
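For context, a custom distilling trainer in place of GKDTrainer typically minimizes a temperature-scaled KL divergence between teacher and student logits. A rough sketch under that assumption (function and argument names are illustrative):

import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 1.0) -> torch.Tensor:
    # Soften both distributions with the temperature, then take
    # KL(teacher || student); the temperature**2 factor keeps gradient
    # magnitudes comparable across temperature settings.
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * temperature**2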

marko1616 (Contributor, Author): It works with DeepSpeed, but requires deepspeed==0.15.4.
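Under ZeRO-3 in particular, a frozen teacher usually needs its own inference-only DeepSpeed engine so its parameters are sharded like the student's. A rough sketch of that common pattern, assuming the deepspeed package; the config handling here is illustrative, not the PR's code:

import deepspeed

def prepare_teacher_deepspeed(teacher_model, ds_config: dict):
    # Build an inference-only engine for the frozen teacher: drop the
    # optimizer/scheduler sections from the training config and let
    # DeepSpeed handle device placement (and ZeRO-3 sharding) for it.
    config = {key: value for key, value in ds_config.items()
              if key not in ("optimizer", "scheduler")}
    engine, _, _, _ = deepspeed.initialize(model=teacher_model, config=config)
    engine.module.eval()
    return engine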

Review thread on src/llamafactory/train/distilling/trainer.py: outdated, resolved.
@hiyouga added the "pending" label (this problem is yet to be addressed) on Jan 8, 2025.
@hiyouga self-requested a review on January 12, 2025, 09:31.