Monocular-Depth-Estimation-Toolbox is an open-source monocular depth estimation toolbox based on PyTorch and MMSegmentation v0.16.0.
It aims to benchmark MonoDepth methods and provides effective support for evaluating and visualizing results.
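To make concrete what such a benchmark evaluates, below is a minimal NumPy sketch of the standard monocular depth metrics (AbsRel, RMSE, and the δ-threshold accuracies). This is illustrative code only, not the toolbox's own evaluation implementation; the function name and default depth range are assumptions here.

```python
import numpy as np

def depth_metrics(pred: np.ndarray, gt: np.ndarray,
                  min_depth: float = 1e-3, max_depth: float = 80.0) -> dict:
    """Standard monocular depth metrics over valid ground-truth pixels."""
    valid = (gt > min_depth) & (gt < max_depth)        # mask out invalid GT
    pred = np.clip(pred[valid], min_depth, max_depth)  # common eval practice
    gt = gt[valid]

    thresh = np.maximum(gt / pred, pred / gt)          # per-pixel ratio
    return {
        'abs_rel': float(np.mean(np.abs(gt - pred) / gt)),
        'rmse': float(np.sqrt(np.mean((gt - pred) ** 2))),
        'rmse_log': float(np.sqrt(np.mean((np.log(gt) - np.log(pred)) ** 2))),
        'delta1': float(np.mean(thresh < 1.25)),
        'delta2': float(np.mean(thresh < 1.25 ** 2)),
        'delta3': float(np.mean(thresh < 1.25 ** 3)),
    }
```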
- **Unified benchmark**

  Provide a unified benchmark toolbox for various depth estimation methods.

- **Modular design**

  Depth estimation frameworks are decomposed into different components. One can easily construct a customized framework by combining different modules (see the config sketch after this list).

- **Support of multiple methods out of box**

  I aim to reproduce some of the most excellent depth estimation methods in this toolbox.

- **High efficiency**

  It seems that there are few depth estimation benchmarks, so I started this project and hope it is helpful for research.
Thanks to MMSeg, we inherit these major features. 😊
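To make the modular design concrete, here is a hedged config sketch of how a model might be assembled from swappable parts. `DepthEncoderDecoder` and `DenseDepthHead` follow MMSeg-style naming conventions and are assumptions here; check the bundled configs for the actual module names and arguments.

```python
# A hedged config sketch: the framework is composed from swappable modules.
# Module names and arguments below are illustrative, MMSeg-style guesses.
model = dict(
    type='DepthEncoderDecoder',     # top-level depth estimator
    backbone=dict(                  # feature extractor: swap freely
        type='ResNet',
        depth=50,
        num_stages=4,
        out_indices=(0, 1, 2, 3)),
    decode_head=dict(               # depth regression head: swap freely
        type='DenseDepthHead',
        in_channels=[256, 512, 1024, 2048],
        channels=64,
        min_depth=1e-3,
        max_depth=80))
```

Swapping the backbone or head is then a one-line change in the config, with no edits to the training code.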
Results and models are available in the model zoo (TODO).
Supported backbones (partially released):
- ResNet (CVPR'2016)
- EfficientNet (ICML'2019)
- SwinTransformer (ICCV'2021)
- I recommend cross-package imports in configs so that you can utilize backbones from other OpenMMLab packages such as MMClassification and MMSegmentation (see the sketch below). Refer to the introduction for details. I will add more backbones in the future.
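For instance, a config might borrow a backbone from MMClassification. The sketch below is an assumption-laden illustration: it presumes mmcls is installed and that its model registry is a child of the shared OpenMMLab root registry, so a scope-prefixed type such as `mmcls.ConvNeXt` resolves correctly.

```python
# A hedged sketch of a cross-package import (assumes mmcls is installed).
# `custom_imports` triggers registration of mmcls models before the model
# is built; the scope-prefixed type then resolves in mmcls's registry.
custom_imports = dict(imports=['mmcls.models'], allow_failed_imports=False)

model = dict(
    type='DepthEncoderDecoder',
    backbone=dict(
        type='mmcls.ConvNeXt',   # backbone taken from MMClassification
        arch='tiny',
        out_indices=(0, 1, 2, 3)),
    # decode_head etc. as in the bundled configs
)
```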
Supported methods:
Supported datasets:
Please refer to get_started.md for installation and dataset_prepare.md for dataset preparation.
Please see the introductions and tutorials of MMSegmentation for basic knowledge of this toolbox. Then, we provide train.md and inference.md (TODO) for the usage of this toolbox; a hypothetical usage sketch follows below. There are also tutorials for customizing datasets (TODO), designing data pipelines (TODO), customizing modules (TODO), and customizing runtime (TODO). We also provide many training tricks (TODO).
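Until inference.md lands, here is a purely hypothetical usage sketch assuming this toolbox mirrors MMSegmentation's high-level API (`init_segmentor`/`inference_segmentor`). The names `init_depther` and `inference_depther` and all file paths are placeholders, not confirmed entry points.

```python
# Hypothetical usage sketch: the imports and paths below are placeholders
# assuming an MMSeg-style API; consult inference.md once it is released.
from depth.apis import init_depther, inference_depther  # hypothetical names

config_file = 'configs/example/example_r50_kitti.py'     # placeholder path
checkpoint_file = 'checkpoints/example_r50_kitti.pth'    # placeholder path

# Build the model from the config, load weights, run on a single image.
model = init_depther(config_file, checkpoint_file, device='cuda:0')
depth_map = inference_depther(model, 'demo/example.png')
```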
- I am currently busy with other projects, so more detailed docs introducing this toolbox will be presented in the future. If you are interested in this project but do not know how to start, you can first refer to the docs of OpenMMLab's next-generation platform.

- Many comments in the code are outdated and waiting to be rewritten.

- I will release the code of our other ongoing work built on this toolbox.