L1 Pruning on resnet18 from mmpretrain #617
mathieu-charbonnel asked this question in Q&A (unanswered)
Good morning :)
I am new to pruning and to model compression in general. I have been playing with mmrazor and took the following config from mmpretrain as a starting point:
_base_ = ['mmpretrain::resnet/resnet18_8xb32_in1k.py']
I then added the pruning-related elements to this config, roughly as sketched below.
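(I am adapting this sketch from mmrazor's l1-norm pruning example configs, so the exact keys and the target ratios are placeholders rather than my exact values.)

```python
# Additions on top of the mmpretrain base config (sketch, not my exact values).
architecture = _base_.model

model = dict(
    _delete_=True,
    _scope_='mmrazor',
    type='ItePruneAlgorithm',
    architecture=architecture,
    mutator_cfg=dict(
        type='ChannelMutator',
        channel_unit_cfg=dict(
            type='L1MutableChannelUnit',
            default_args=dict(choice_mode='ratio'))),
    # Fraction of channels to keep per prunable unit (placeholder unit names
    # and ratios; the real keys come from the mutator's channel analysis).
    target_pruning_ratio={
        'backbone.conv1_(0, 64)_64': 0.5,
        'backbone.layer1.0.conv1_(0, 64)_64': 0.5,
    },
)
```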
Then I ran a short training on one image using:
! timeout 2m python mmrazor/tools/train.py $pruning_cfg_path --work-dir 'demo/pruning'
Then I ran the following:
and finally:
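(Roughly, the final export to ONNX looked like the mmdeploy command below; the deploy config path, checkpoint name and test image are placeholders, so the exact invocation may differ from what I actually ran.)

```bash
# Sketch of the ONNX export via mmdeploy (paths/filenames are placeholders).
python mmdeploy/tools/deploy.py \
    mmdeploy/configs/mmpretrain/classification_onnxruntime_dynamic.py \
    $pruning_cfg_path \
    demo/pruning/epoch_X.pth \
    demo/demo.JPEG \
    --work-dir demo/pruning/onnx
```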
Here is the ONNX visualized in Netron. It has GatherND operations everywhere, and it seems the masks (from the MutableChannelUnits?) are still there. Is this really the deployed model at the final pruning stage, or is there something I am missing? I would expect a model similar to the base ResNet18 but with a lower number of channels in the pruned convolution layers.
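(In case it helps, this is roughly how I listed the operator types in the exported graph, using the standard onnx Python package; the file path is a placeholder for wherever the exported model ends up.)

```python
# Count the ONNX operator types in the exported graph to see how many
# GatherND (mask/indexing) nodes remain after deployment.
from collections import Counter

import onnx

model = onnx.load('demo/pruning/onnx/end2end.onnx')  # placeholder path
op_counts = Counter(node.op_type for node in model.graph.node)
for op_type, count in op_counts.most_common():
    print(f'{op_type}: {count}')
```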
Thank you very much in advance :),
Mathieu