Explicitly set batch=1 for NNCF in order to avoid issue with Wave2Vec #312
Conversation
@AlexKoff88 @yujiepan-work @vuiseng9 please take a look
I am mostly ok with the changes
@echarlaix Could this be merged before the upcoming dot release?
Thanks for the fix @ljaljushkin! Can you rebase as well as run the make style command to fix the tests?
Done
What does this PR do?
Explicitly sets the batch size to 1 in the NNCFConfig in order to avoid an issue with Wav2Vec2 in the upcoming NNCF release.
After switching to NNCF v2.5.0, JPQD uses the automatic structured pruning algorithm, which currently does not work properly with batch size > 1 for the Wav2Vec2 model.
It works fine when the batch size is 1. With this PR, the batch size is set only in the NNCFConfig; this should not affect training or inference with other batch sizes, since the value is used solely by NNCF internals to set up compression.
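For illustration only, here is a minimal sketch of what pinning the batch dimension in an NNCFConfig can look like. This is not the actual diff of this PR; the 16000-sample input length and the single quantization algorithm entry are hypothetical placeholders.

```python
from nncf import NNCFConfig

# Sketch only: the batch dimension of "sample_size" is pinned to 1 so that NNCF
# traces the model with batch=1 when it sets up compression. This value is used
# only by NNCF internals and does not constrain the batch size used for actual
# training or inference.
nncf_config = NNCFConfig.from_dict(
    {
        # [batch, num_audio_samples]; the sequence length of 16000 is a hypothetical value
        "input_info": {"sample_size": [1, 16000]},
        # Simplified compression section; the real JPQD setup combines several algorithms
        "compression": [{"algorithm": "quantization"}],
    }
)
```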
The proper fix for Wav2Vec2 will be introduced in a future NNCF release. Here is the draft (openvinotoolkit/nncf#1784) that will be finalized later.
The fix has been verified against the existing NNCF and JPQD tests.
Fixes # (issue)
Ticket 110126 in the internal NNCF tracking system