From f5adc8ad843afce2b1d50d349359a8b3afe13e08 Mon Sep 17 00:00:00 2001
From: VainF <2218880241@qq.com>
Date: Sat, 22 Jul 2023 00:49:04 +0800
Subject: [PATCH] update readme

---
 README.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index 0676a67b..0dcb9ce1 100644
--- a/README.md
+++ b/README.md
@@ -12,14 +12,14 @@
   Tested PyTorch Versions
   License
   Downloads
-  Latest Version
+  Latest Version
   Open In Colab
   arXiv

-Torch-Pruning (TP) is a library for structural pruning that enables the following features:
+Torch-Pruning (TP) is a library for structural pruning with the following features:
 
 * **General-purpose Pruning Toolkit:** TP enables structural pruning for a wide range of deep neural networks, including *[Large Language Models (LLMs)](https://github.com/horseee/LLM-Pruner), [Diffusion Models](https://github.com/VainF/Diff-Pruning), [Yolov7](examples/yolov7/), [yolov8](examples/yolov8/), [ViT](examples/torchvision_models/), FasterRCNN, SSD, ResNe(X)t, ConvNext, DenseNet, RegNet, DeepLab, etc*. Different from [torch.nn.utils.prune](https://pytorch.org/tutorials/intermediate/pruning_tutorial.html), which zeroizes parameters through masking, Torch-Pruning deploys a (non-deep) graph algorithm called **DepGraph** to remove parameters physically. Currently, TP is able to prune approximately **81/85=95.3%** of the models from Torchvision 0.13.1. Try this [Colab Demo](https://colab.research.google.com/drive/1TRvELQDNj9PwM-EERWbF3IQOyxZeDepp?usp=sharing) for a quick start.
 * **[Performance Benchmark](benchmarks)**: Reproduce our results in the DepGraph paper.
@@ -41,7 +41,7 @@ Please do not hesitate to open a [discussion](https://github.com/VainF/Torch-Pru
 
 ### **Features:**
 - [x] Structural pruning for CNNs, Transformers, Detectors, Language Models and Diffusion Models. Please refer to the [examples](examples).
-- [x] High-level pruners: [MagnitudePruner](https://arxiv.org/abs/1608.08710), [BNScalePruner](https://arxiv.org/abs/1708.06519), [GroupNormPruner](https://arxiv.org/abs/2301.12900), RandomPruner, etc.
+- [x] High-level pruners: [MagnitudePruner](https://arxiv.org/abs/1608.08710), [BNScalePruner](https://arxiv.org/abs/1708.06519), [GroupNormPruner](https://arxiv.org/abs/2301.12900), [GrowingRegPruner](https://arxiv.org/abs/2012.09243), RandomPruner, etc.
 - [x] Importance Criteria: L-p Norm, Taylor, Random, BNScaling, etc.
 - [x] Dependency Graph for automatic structural pruning
 - [x] Supported modules: Linear, (Transposed) Conv, Normalization, PReLU, Embedding, MultiheadAttention, nn.Parameters and [customized modules](tests/test_customized_layer.py).
@@ -53,7 +53,7 @@ Please do not hesitate to open a [discussion](https://github.com/VainF/Torch-Pru
 - [ ] A strong baseline with bags of tricks from existing methods.
 - [ ] A benchmark for [Torchvision](https://pytorch.org/vision/stable/models.html) compatibility (**81/85=95.3%**, :heavy_check_mark:) and [timm](https://github.com/huggingface/pytorch-image-models) compatibility.
 - [ ] Pruning from Scratch / at Initialization.
-- [ ] More high-level pruners like [FisherPruner](https://arxiv.org/abs/2108.00708), [GrowingReg](https://arxiv.org/abs/2012.09243), etc.
+- [ ] More high-level pruners like [FisherPruner](https://arxiv.org/abs/2108.00708), etc.
 - [ ] More Transformers like Vision Transformers (:heavy_check_mark:), Swin Transformers, PoolFormers.
 - [ ] Block/Layer/Depth Pruning
 - [ ] Pruning benchmarks for CIFAR, ImageNet and COCO.
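For reviewers reading this patch without the full README: the `@@ -146` hunk below sits in the Quick Start section, which demonstrates the low-level DepGraph API described above. A minimal sketch of that workflow, assuming Torch-Pruning v1.x; the ResNet-18 model and the channel indices are illustrative and not taken from this patch:

```python
import torch
import torch.nn as nn
import torch_pruning as tp
from torchvision.models import resnet18

model = resnet18()  # illustrative model; any traceable network works
example_inputs = torch.randn(1, 3, 224, 224)

# Trace the model once to record layer dependencies (DepGraph).
DG = tp.DependencyGraph().build_dependency(model, example_inputs=example_inputs)

# Pruning conv1's output channels yields a group that also covers coupled
# layers (e.g. the following BatchNorm), so tensor shapes stay consistent.
group = DG.get_pruning_group(model.conv1, tp.prune_conv_out_channels, idxs=[2, 6, 9])
if DG.check_pruning_group(group):  # skip groups that would remove all channels
    group.prune()                  # physically removes the parameters

# Or iterate every prunable group, as in the hunk header below.
for group in DG.get_all_groups(ignored_layers=[model.conv1],
                               root_module_types=[nn.Conv2d, nn.Linear]):
    print(group)
```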
@@ -146,7 +146,7 @@ for group in DG.get_all_groups(ignored_layers=[model.conv1], root_module_types=[
 
 ### 2. High-level Pruners
 
-Leveraging the DependencyGraph, we developed several high-level pruners in this repository to facilitate effortless pruning. By specifying the desired channel sparsity, the pruner will scan all prunable groups, prune the entire model, and fine-tune it using your own training code. For detailed information on this process, please refer to [this tutorial](https://github.com/VainF/Torch-Pruning/blob/master/tutorials/1%20-%20Customize%20Your%20Own%20Pruners.ipynb), which shows how to implement a [slimming](https://arxiv.org/abs/1708.06519) pruner from scratch. Additionally, a more practical example is available in [benchmarks/main.py](benchmarks/main.py).
+Leveraging the DependencyGraph, we developed several high-level pruners in this repository to facilitate effortless pruning. By specifying the desired channel sparsity, the pruner will scan all prunable groups, prune the entire model, and fine-tune it using your own training code. For detailed information on this process, please refer to [this tutorial](https://github.com/VainF/Torch-Pruning/blob/master/examples/notebook/1%20-%20Customize%20Your%20Own%20Pruners.ipynb), which shows how to implement a [slimming](https://arxiv.org/abs/1708.06519) pruner from scratch. Additionally, a more practical example is available in [benchmarks/main.py](benchmarks/main.py).
 
 ```python
 import torch
 ```
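The README's own code block is cut off at the end of this patch. For context, a sketch of the high-level pruner usage the paragraph above describes, assuming Torch-Pruning v1.x; the `ch_sparsity` value and ResNet-18 model are illustrative choices, not taken from this patch:

```python
import torch
import torch.nn as nn
import torch_pruning as tp
from torchvision.models import resnet18

model = resnet18(pretrained=True)
example_inputs = torch.randn(1, 3, 224, 224)

# Magnitude-based importance (L2 norm), as used by MagnitudePruner.
imp = tp.importance.MagnitudeImportance(p=2)

# Keep the final classifier unpruned.
ignored_layers = [m for m in model.modules() if isinstance(m, nn.Linear)]

pruner = tp.pruner.MagnitudePruner(
    model,
    example_inputs,
    importance=imp,
    ch_sparsity=0.5,               # target channel sparsity
    ignored_layers=ignored_layers,
)
pruner.step()  # scans all prunable groups and prunes the model in place
# Fine-tuning `model` with your own training code would follow here.
```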