
Zipformer MVQ #1190

Open

wants to merge 4 commits into master
Conversation

marcoyang1998 (Collaborator)
This PR makes knowledge distillation an option in the Zipformer recipe. The knowledge distillation method is MVQ-KD (multi-codebook vector quantization knowledge distillation).

The teacher targets can be downloaded via the following command:

```shell
./distillation_with_hubert.sh --stage 2 --stop_stage 2
```

To turn on knowledge distillation, set `--enable-distillation True`. This applies to both streaming and non-streaming Zipformers.
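For intuition, here is a minimal sketch of the MVQ-KD objective: the student predicts, per frame, the multi-codebook vector-quantization indexes of the teacher (HuBERT) embeddings, and the loss is a cross-entropy averaged over frames and codebooks. This is not the recipe's actual code; the function name `mvq_kd_loss` and the tensor shapes are assumptions for illustration only.

```python
import numpy as np

def mvq_kd_loss(logits, indexes):
    """Cross-entropy between the student's predicted codebook
    distributions and the teacher's codebook indexes.

    logits:  (T, N, V) student predictions for N codebooks of size V
    indexes: (T, N)    teacher codebook indexes, ints in [0, V)
    """
    # log-softmax over the V codebook entries (numerically stable)
    m = logits.max(axis=-1, keepdims=True)
    logz = m + np.log(np.exp(logits - m).sum(axis=-1, keepdims=True))
    logp = logits - logz
    # pick the log-probability assigned to each teacher index
    picked = np.take_along_axis(logp, indexes[..., None], axis=-1)[..., 0]
    # average negative log-likelihood over frames and codebooks
    return -picked.mean()

# usage sketch: 5 frames, 8 codebooks, 256 entries per codebook
rng = np.random.default_rng(0)
logits = rng.normal(size=(5, 8, 256))
indexes = rng.integers(0, 256, size=(5, 8))
loss = mvq_kd_loss(logits, indexes)
```

In the real recipe this term is added to the transducer loss with a weighting factor, so the student learns both the ASR task and the teacher's quantized representations.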

Detailed results will follow.

@marcoyang1998 (Collaborator, Author)
Some results:

100 hours:

| model | test-clean | test-other |
|---|---|---|
| baseline, epoch-30-avg-9 | 5.97 | 15.73 |
| + mvq, epoch-30-avg-9 | 5.13 | 13.08 |

960 hours:

| model | test-clean | test-other |
|---|---|---|
| baseline, epoch-30-avg-9 | 2.25 | 5.06 |
| + mvq, epoch-30-avg-9 | 2.18 | 4.86 |

```python
    sp=sp,
    params=params,
)
# if not params.print_diagnostics:
```
Collaborator

Is there a particular reason this was commented out?

Collaborator (Author)
It was commented out just for speed during my experiments. I will make sure to un-comment it before merging the PR.
