This repository was created for the mid-term project of DATA620004, Fudan University. The main part of the code is forked from https://github.com/Tramac/awesome-semantic-segmentation-pytorch, which implements a variety of advanced deep learning models for semantic segmentation in PyTorch. For our mid-term project, we train the BiSeNet model on Cityscapes with different loss functions and explore their influence on the results. To make the experiments easy to reproduce, the following sections describe how to install the package and how to train and evaluate the model.
```bash
# semantic-segmentation-pytorch dependencies
pip install ninja tqdm

# follow the PyTorch installation guide at https://pytorch.org/get-started/locally/
conda install pytorch torchvision -c pytorch

# install PyTorch Segmentation
git clone https://github.com/Tramac/awesome-semantic-segmentation-pytorch.git

# the following installs the lib with symbolic links, so you can modify
# the files if you want without having to re-build it
cd awesome-semantic-segmentation-pytorch/core/nn
python setup.py build develop
```
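After the build step, a quick sanity check is to import PyTorch and confirm GPU visibility. The `core.nn` import below assumes you run Python from the repository root; that path is an assumption about the layout, not something the install guarantees, so the check is wrapped in a try/except.

```python
# Minimal post-install sanity check (PyTorch APIs only).
import torch

print("torch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())

# Assumption: running from the repository root so that the `core` package
# (and the extension built above) is importable; skip if the layout differs.
try:
    import core.nn  # noqa: F401
    print("core.nn imported OK")
except ImportError as err:
    print("core.nn not importable:", err)
```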
- Use CrossEntropy

```bash
# for example, use a single GPU and set the batch size to 8:
python train.py --model bisenet --backbone resnet18 --dataset citys --lr 0.01 --epochs 80 --batch-size 8
```
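For reference, the plain cross-entropy baseline is just per-pixel cross-entropy as provided by PyTorch's `nn.CrossEntropyLoss`. The sketch below is only an illustration of the loss on dummy tensors; `ignore_index=-1` is an assumed value for the "void" label, not necessarily the repository's setting.

```python
import torch
import torch.nn as nn

# Per-pixel cross-entropy over a (N, C, H, W) score map and an (N, H, W) label map.
criterion = nn.CrossEntropyLoss(ignore_index=-1)  # ignore_index=-1 is an assumption

scores = torch.randn(2, 19, 64, 64)          # 19 Cityscapes classes
labels = torch.randint(0, 19, (2, 64, 64))   # dummy ground-truth masks
loss = criterion(scores, labels)
print(loss.item())
```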
- Use Auxiliary Loss

```bash
# for example, use a single GPU and set the batch size to 8:
python train.py --model bisenet --backbone resnet18 --dataset citys --lr 0.01 --epochs 80 --batch-size 8 --aux
```
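Conceptually, `--aux` adds an extra segmentation head whose loss is weighted and summed with the main loss. The sketch below only illustrates that pattern on dummy tensors; the 0.4 weight and the two-head interface are assumptions, not the repository's exact implementation.

```python
import torch
import torch.nn as nn

def total_loss(main_out, aux_out, target, aux_weight=0.4):
    """Combine the main and auxiliary losses (aux_weight=0.4 is an assumed value)."""
    criterion = nn.CrossEntropyLoss(ignore_index=-1)
    return criterion(main_out, target) + aux_weight * criterion(aux_out, target)

# Dummy example with two heads producing score maps of the same shape.
main_out = torch.randn(2, 19, 64, 64)
aux_out = torch.randn(2, 19, 64, 64)
target = torch.randint(0, 19, (2, 64, 64))
print(total_loss(main_out, aux_out, target).item())
```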
- Use Ohem CrossEntropy

```bash
# for example, use multiple GPUs and set the batch size to 32:
export NGPUS=4
python -m torch.distributed.launch --nproc_per_node=$NGPUS train.py --model bisenet --backbone resnet18 --dataset citys --lr 0.025 --epochs 100 --use-ohem True
```
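OHEM (online hard example mining) cross-entropy averages the loss only over the hardest pixels in each batch. The following is a minimal sketch of that idea; the probability threshold of 0.7 and `min_kept=1000` are assumed values, and this is not the repository's exact implementation.

```python
import torch
import torch.nn.functional as F

def ohem_cross_entropy(scores, target, thresh=0.7, min_kept=1000, ignore_index=-1):
    """Keep only pixels whose predicted probability for the true class is low.
    thresh and min_kept are assumed values, not the repo's defaults."""
    # Per-pixel loss, not yet reduced.
    pixel_loss = F.cross_entropy(scores, target, ignore_index=ignore_index,
                                 reduction="none").view(-1)
    # Probability assigned to the ground-truth class at each pixel.
    prob = F.softmax(scores, dim=1)
    gt_prob = prob.gather(1, target.clamp(min=0).unsqueeze(1)).view(-1)
    valid = target.view(-1) != ignore_index

    # A pixel is "hard" if its ground-truth probability is below the threshold.
    hard = valid & (gt_prob < thresh)
    if hard.sum() < min_kept:
        # Too few hard pixels: fall back to the min_kept largest losses among valid pixels.
        k = min(min_kept, int(valid.sum()))
        return pixel_loss[valid].topk(k).values.mean()
    return pixel_loss[hard].mean()

scores = torch.randn(2, 19, 64, 64)
target = torch.randint(0, 19, (2, 64, 64))
print(ohem_cross_entropy(scores, target).item())
```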
- Single GPU evaluating

```bash
# for example, evaluate bisenet_resnet18_citys with a single GPU:
python eval.py --model bisenet --backbone resnet18 --dataset citys
```
- Multi-GPU evaluating

```bash
# for example, evaluate bisenet_resnet18_citys_aux with 4 GPUs:
export NGPUS=4
python -m torch.distributed.launch --nproc_per_node=$NGPUS eval.py --model bisenet --backbone resnet18 --dataset citys --aux
```
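Evaluation reports pixel accuracy and mean IoU, the two metrics shown in the result table below. The sketch here only illustrates how those metrics are defined in terms of a confusion matrix; it is not the repository's evaluation code.

```python
import numpy as np

def confusion_matrix(pred, label, num_classes):
    """Accumulate a num_classes x num_classes confusion matrix from flat label arrays."""
    mask = (label >= 0) & (label < num_classes)
    idx = num_classes * label[mask].astype(int) + pred[mask].astype(int)
    return np.bincount(idx, minlength=num_classes ** 2).reshape(num_classes, num_classes)

def pix_acc_and_miou(conf):
    # Pixel accuracy: correctly classified pixels / all pixels.
    pix_acc = np.diag(conf).sum() / conf.sum()
    # Per-class IoU: TP / (TP + FP + FN), then averaged over classes.
    iou = np.diag(conf) / (conf.sum(1) + conf.sum(0) - np.diag(conf))
    return pix_acc, np.nanmean(iou)

# Dummy 19-class example.
pred = np.random.randint(0, 19, size=10000)
label = np.random.randint(0, 19, size=10000)
conf = confusion_matrix(pred, label, 19)
print(pix_acc_and_miou(conf))
```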
```bash
cd ./scripts
python demo.py --model fcn32s_vgg16_voc --input-pic ./datasets/test.jpg
```
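The demo script runs a trained model on a single image and saves the predicted mask. A minimal version of that flow is sketched below with a trivial stand-in network in place of a trained checkpoint (loading the real model is a repo-specific step); the normalization statistics are assumed ImageNet values.

```python
import torch
from PIL import Image
from torchvision import transforms

# Stand-in "model": a 1x1 conv producing 19-class scores. In practice this is
# where the trained segmentation checkpoint would be loaded (repo-specific step).
model = torch.nn.Conv2d(3, 19, kernel_size=1).eval()

# ImageNet-style normalization; the exact statistics used by the repo are assumed.
preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

img = Image.open("./datasets/test.jpg").convert("RGB")
x = preprocess(img).unsqueeze(0)          # (1, 3, H, W)

with torch.no_grad():
    scores = model(x)                     # (1, 19, H, W) per-pixel class scores
mask = scores.argmax(1).squeeze(0).to(torch.uint8)

Image.fromarray(mask.numpy()).save("pred_mask.png")
print("saved pred_mask.png")
```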
```
.{SEG_ROOT}
├── scripts
│   ├── demo.py
│   ├── eval.py
│   └── train.py
```
```bash
cd ./runs
tensorboard --logdir tensorboardx
```
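Training curves end up under `runs/tensorboardx`. If you want to log additional scalars yourself, the usual tensorboardX pattern looks like the sketch below; the log directory and tag names are assumptions chosen to match the folder above, and tensorboardX is assumed to be installed.

```python
from tensorboardX import SummaryWriter

# Directory name assumed to match the repo's tensorboard output folder.
writer = SummaryWriter("runs/tensorboardx")
for step in range(100):
    fake_loss = 1.0 / (step + 1)                 # dummy value standing in for the training loss
    writer.add_scalar("train/loss", fake_loss, step)
writer.close()
```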
- FCN
- ENet
- PSPNet
- ICNet
- DeepLabv3
- DeepLabv3+
- DenseASPP
- EncNet
- BiSeNet
- PSANet
- DANet
- OCNet
- CGNet
- ESPNetv2
- CCNet
- DUNet(DUpsampling)
- FastFCN(JPU)
- LEDNet
- Fast-SCNN
- LightSeg
- DFANet
See DETAILS for the supported model & backbone combinations.
```
.{SEG_ROOT}
├── core
│   ├── models
│   │   ├── bisenet.py
│   │   ├── danet.py
│   │   ├── deeplabv3.py
│   │   ├── deeplabv3+.py
│   │   ├── denseaspp.py
│   │   ├── dunet.py
│   │   ├── encnet.py
│   │   ├── fcn.py
│   │   ├── pspnet.py
│   │   ├── icnet.py
│   │   ├── enet.py
│   │   ├── ocnet.py
│   │   ├── ccnet.py
│   │   ├── psanet.py
│   │   ├── cgnet.py
│   │   ├── espnet.py
│   │   ├── lednet.py
│   │   ├── dfanet.py
│   │   ├── ......
```
You can run a script to download a dataset, for example:

```bash
cd ./core/data/downloader
python ade20k.py --download-dir ../datasets/ade
```
Dataset | training set | validation set | testing set |
---|---|---|---|
VOC2012 | 1464 | 1449 | ✘ |
VOCAug | 11355 | 2857 | ✘ |
ADE20K | 20210 | 2000 | ✘ |
Cityscapes | 2975 | 500 | ✘ |
COCO | |||
SBU-shadow | 4085 | 638 | ✘ |
LIP(Look into Person) | 30462 | 10000 | 10000 |
```
.{SEG_ROOT}
├── core
│   ├── data
│   │   ├── dataloader
│   │   │   ├── ade.py
│   │   │   ├── cityscapes.py
│   │   │   ├── mscoco.py
│   │   │   ├── pascal_aug.py
│   │   │   ├── pascal_voc.py
│   │   │   ├── sbu_shadow.py
│   │   └── downloader
│   │       ├── ade20k.py
│   │       ├── cityscapes.py
│   │       ├── mscoco.py
│   │       ├── pascal_voc.py
│   │       └── sbu_shadow.py
```
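Once a dataset is downloaded, it can be wrapped in a regular PyTorch `DataLoader`. The sketch below assumes the Cityscapes dataset class in `core/data/dataloader/cityscapes.py` is named `CitySegmentation` and accepts `root` and `split` arguments; these names and the data root path are assumptions, so check that file for the actual interface.

```python
from torch.utils.data import DataLoader

# Assumption: class name and constructor arguments mirror
# core/data/dataloader/cityscapes.py; verify against the file itself.
from core.data.dataloader.cityscapes import CitySegmentation

train_set = CitySegmentation(root="./datasets/citys", split="train")  # root path is an assumption
train_loader = DataLoader(train_set, batch_size=2, shuffle=True, num_workers=0)

batch = next(iter(train_loader))   # typically images plus label masks
print(type(batch), len(batch))
```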
- Cityscapes
Loss | Backbone | TrainSet | EvalSet | Learning rate | epochs | JPU | Mean IoU | pixAcc |
---|---|---|---|---|---|---|---|---|
CrossEntropy | resnet18 | train | val | 0.01 | 80 | ✘ | 63.555 | 93.459 |
CrossEntropy+aux | resnet18 | train | val | 0.01 | 80 | ✘ | 64.406 | 93.642 |
Ohem | resnet18 | train | val | 0.025 | 100 | ✘ | 63.130 | 93.113 |
Ohem+aux | resnet18 | train | val | 0.025 | 100 | ✘ | 63.724 | 93.044 |
Note: batch sizes follow the training commands above (8 on a single GPU for the CrossEntropy runs, 32 across 4 GPUs for the Ohem runs).
See TEST for details.
```
.{SEG_ROOT}
├── tests
│   └── test_model.py
```