Hawkeye is a unified, deep learning-based fine-grained image recognition toolbox built on PyTorch, designed for researchers and engineers. Currently, Hawkeye contains representative fine-grained recognition methods of different paradigms, including utilizing deep filters, leveraging attention mechanisms, performing high-order feature interactions, designing specific loss functions, recognizing with web data, as well as miscellaneous others.
Nov 01, 2022: Hawkeye is launched!
The following methods are placed in `model/methods`, and the corresponding losses are placed in `model/loss`. Experimental results on CUB-200 for these methods (except the asterisked ones) can be found in the `results.csv` file.
- Utilizing Deep Filters
- Leveraging Attention Mechanisms
- Performing High-Order Feature Interactions
- Designing Specific Loss Functions
- Recognition with Web Data
- Miscellaneous
We provide a brief tutorial for Hawkeye.
```shell
git clone https://github.com/Hawkeye-FineGrained/Hawkeye.git
cd Hawkeye
```
- Python 3.8
- PyTorch 1.11.0 or higher
- torchvision 0.12.0 or higher
- numpy
- yacs
- tqdm
Eight representative fine-grained recognition benchmark datasets are provided as follows.
| Dataset | Year | Meta-class | # Images | # Categories | Download Link |
|---|---|---|---|---|---|
| CUB-200 | 2011 | Birds | 11,788 | 200 | https://data.caltech.edu/records/65de6-vp158/files/CUB_200_2011.tgz |
| Stanford Dog | 2011 | Dogs | 20,580 | 120 | http://vision.stanford.edu/aditya86/ImageNetDogs/images.tar |
| Stanford Car | 2013 | Cars | 16,185 | 196 | http://ai.stanford.edu/~jkrause/car196/car_ims.tgz |
| FGVC Aircraft | 2013 | Aircraft | 10,000 | 100 | https://www.robots.ox.ac.uk/~vgg/data/fgvc-aircraft/archives/fgvc-aircraft-2013b.tar.gz |
| iNat2018 | 2018 | Plants & Animals | 461,939 | 8,142 | https://ml-inat-competition-datasets.s3.amazonaws.com/2018/train_val2018.tar.gz |
| WebFG-bird | 2021 | Birds | 18,388 | 200 | https://web-fgvc-496-5089-sh.oss-cn-shanghai.aliyuncs.com/web-bird.tar.gz |
| WebFG-car | 2021 | Cars | 21,448 | 196 | https://web-fgvc-496-5089-sh.oss-cn-shanghai.aliyuncs.com/web-car.tar.gz |
| WebFG-aircraft | 2021 | Aircraft | 13,503 | 100 | https://web-fgvc-496-5089-sh.oss-cn-shanghai.aliyuncs.com/web-aircraft.tar.gz |
You can download a dataset to the `data/` directory with the following commands. We take CUB-200 as an example.
```shell
cd Hawkeye/data
wget https://data.caltech.edu/records/65de6-vp158/files/CUB_200_2011.tgz
mkdir bird && tar -xvf CUB_200_2011.tgz -C bird/
```
We provide the meta-data files of the datasets in `metadata/`, with train and val lists following the official splits of each dataset. There is no need to modify the decompressed directory of a dataset. The following is an example of the directory structure for two datasets.
```
data
├── bird
│   ├── CUB_200_2011
│   │   ├── images
│   │   │   ├── 001.Black_footed_Albatross
│   │   │   │   ├── Black_Footed_Albatross_0001_796111.jpg
│   │   │   │   └── ···
│   │   │   └── ···
│   │   └── ···
│   └── ···
├── web-car
│   ├── train
│   │   └── Acura Integra Type R 2001
│   │       ├── Acura Integra Type R 2001_00001.jpg
│   │       └── ···
│   ├── val
│   │   ├── Acura Integra Type R 2001
│   │   │   ├── 000450.jpg
│   │   │   └── ···
│   │   └── ···
│   └── ···
└── ···
```
When using different datasets, you need to modify the dataset path in the corresponding config file: `meta_dir` is the path to the meta-data files containing the train and val lists, and `root_dir` is the path to the image folder in `data/`. Note that the relative paths in the meta-data lists must match `root_dir`. Here are two examples.
```yaml
dataset:
  name: cub
  root_dir: data/bird/CUB_200_2011/images
  meta_dir: metadata/cub
```
```yaml
dataset:
  name: web_car
  root_dir: data/web-car
  meta_dir: metadata/web_car
```
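To make the relation between `meta_dir` and `root_dir` concrete, here is a minimal Python sketch of how a train list could be read and joined against `root_dir`. The file name `train.txt` and the `<relative_path> <label>` line format are assumptions for illustration, not necessarily Hawkeye's actual loader logic.

```python
import os

# Paths mirror the cub config above.
root_dir = "data/bird/CUB_200_2011/images"
meta_dir = "metadata/cub"

# Assumption for illustration: the train list is named "train.txt" and each
# line is "<relative_image_path> <label>". Check metadata/cub for the real
# file names and format.
samples = []
with open(os.path.join(meta_dir, "train.txt")) as f:
    for line in f:
        line = line.strip()
        if not line:
            continue
        rel_path, label = line.rsplit(maxsplit=1)
        # The relative path must resolve under root_dir, which is why the
        # two settings have to agree.
        samples.append((os.path.join(root_dir, rel_path), int(label)))
```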
Note that ProtoTree is trained on an offline-augmented dataset; refer to the link if needed. We only provide meta-data for the offline-augmented CUB-200, in `metadata/cub_aug`.
For each method in the repo, we provide a separate training example file in the `Examples/` directory.
For example, the command to train APINet:

```shell
python Examples/APINet.py --config configs/APINet.yaml
```
The default parameters of the experiment are shown in `configs/APINet.yaml`.
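Hawkeye lists yacs as a dependency, and its configs are yacs-style YAML files. The following is a rough sketch of how such a config can be loaded and overridden with yacs; the default keys shown are illustrative, not the toolbox's actual schema.

```python
from yacs.config import CfgNode as CN

# Illustrative default schema; the toolbox defines its own defaults internally.
_C = CN()
_C.dataset = CN()
_C.dataset.name = "cub"
_C.dataset.root_dir = "data/bird/CUB_200_2011/images"
_C.dataset.meta_dir = "metadata/cub"

cfg = _C.clone()
# Override the defaults with values from a YAML file; yacs requires every
# key in the file to already exist in the defaults above.
cfg.merge_from_file("configs/APINet.yaml")
# Ad-hoc overrides, e.g. assembled from command-line arguments.
cfg.merge_from_list(["dataset.meta_dir", "metadata/cub"])
cfg.freeze()
print(cfg.dataset.root_dir)
```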
Some methods require multi-stage training.
For example, training BCNN requires two stages of training, cf. its two config files. First, perform the first stage of model training:

```shell
python Examples/BCNN.py --config configs/BCNN_S1.yaml
```

Then, perform the second stage of training. You need to modify the weight path of the model (`load` in `BCNN_S2.yaml`) to load the model parameters obtained from the first stage of training, such as `results/bcnn/bcnn_cub s1/best_model.pth`:

```shell
python Examples/BCNN.py --config configs/BCNN_S2.yaml
```
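Under the hood, resuming from a stage-1 checkpoint amounts to standard PyTorch state-dict loading. Here is a minimal sketch, with a stand-in model and the assumption that `best_model.pth` stores a plain state dict:

```python
import torch
import torchvision

# Stand-in network for illustration; the real BCNN class lives in model/methods.
model = torchvision.models.resnet50(num_classes=200)

# Assumption: best_model.pth stores a plain state_dict. If the checkpoint
# wraps it (e.g. {"model": state_dict}), unwrap accordingly.
state_dict = torch.load("results/bcnn/bcnn_cub s1/best_model.pth",
                        map_location="cpu")
model.load_state_dict(state_dict)

# Stage two then continues training from these weights.
model.train()
```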
In addition, method-specific parameters are commented in the corresponding config files.
We provide sample code for testing a model. For example, to test BCNN, run:

```shell
python test.py --config configs/test.yaml
```

You can modify `test.py` and `test.yaml` to test other models.
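If you prefer to script your own evaluation, the core of such a test is top-1 accuracy over the val split. A generic sketch follows (model and DataLoader construction are placeholders, not Hawkeye's API):

```python
import torch

@torch.no_grad()
def evaluate(model, val_loader, device="cuda"):
    """Compute top-1 accuracy of `model` over a validation DataLoader."""
    model.eval().to(device)
    correct = total = 0
    for images, labels in val_loader:
        images, labels = images.to(device), labels.to(device)
        preds = model(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.numel()
    return correct / total
```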
This project is released under the MIT license.
If you have any questions about our work, please do not hesitate to contact us by email.
Xiu-Shen Wei: [email protected]
Jiabei He: [email protected]
Yang Shen: [email protected]
This project is supported by the National Key R&D Program of China (2021YFA1001100), the National Natural Science Foundation of China under Grant 62272231, the Natural Science Foundation of Jiangsu Province of China under Grant BK20210340, and the Fundamental Research Funds for the Central Universities (No. 30920041111, No. NJ2022028).