Dear author,

Thank you for your blogs and repo. I am learning about quantization and your tutorial has been very helpful. However, I am wondering why we need to fuse the modules in the snippet below:

```python
# Fuse the model in place rather than manually.
fused_model = torch.quantization.fuse_modules(fused_model, [["conv1", "bn1", "relu"]], inplace=True)
for module_name, module in fused_model.named_children():
    if "layer" in module_name:
        for basic_block_name, basic_block in module.named_children():
            torch.quantization.fuse_modules(basic_block, [["conv1", "bn1", "relu1"], ["conv2", "bn2"]], inplace=True)
            for sub_block_name, sub_block in basic_block.named_children():
                if sub_block_name == "downsample":
                    torch.quantization.fuse_modules(sub_block, [["0", "1"]], inplace=True)
```

Could you explain this for me? Thank you.
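For what it's worth, a minimal sketch of what fusion does may help frame the question. The `Block` class, layer sizes, and input shape below are illustrative, not from the repo; the point is that `fuse_modules` folds the BatchNorm (and ReLU) into the preceding Conv, leaving `Identity` placeholders behind, while the floating-point output stays numerically the same:

```python
import torch
import torch.nn as nn

# A small block mirroring the conv1/bn1/relu pattern fused in the tutorial.
# (This Block is a made-up example, not the actual model from the repo.)
class Block(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(8)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.bn1(self.conv1(x)))

m = Block().eval()  # fusion requires eval mode
x = torch.randn(1, 3, 8, 8)
ref = m(x)

# Fuse Conv2d + BatchNorm2d + ReLU into a single module.
fused = torch.quantization.fuse_modules(m, [["conv1", "bn1", "relu"]], inplace=False)
out = fused(x)

# conv1 is now a fused ConvReLU2d with BN folded into its weights;
# bn1 and relu have been replaced by nn.Identity.
print(type(fused.conv1).__name__)
print(torch.allclose(ref, out, atol=1e-5))
```

My understanding is that this matters for quantization because each fused unit is observed and quantized as one op, instead of quantizing Conv, BN, and ReLU outputs separately, which is both faster and more accurate.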