SeqTrack: Sequence to Sequence Learning for Visual Object Tracking
Accepted by CVPR 2023
Xin Chen, Houwen Peng, Dong Wang, Huchuan Lu, Han Hu
This is the official PyTorch implementation of the CVPR 2023 paper SeqTrack: Sequence to Sequence Learning for Visual Object Tracking, a new framework for visual object tracking.
SeqTrack casts tracking as a sequence generation task: if the model knows where the target object is, we can simply teach it to read the bounding box out.
SeqTrack adopts only a plain encoder-decoder transformer architecture trained with a simple cross-entropy loss.
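To make the sequence formulation concrete, here is a minimal sketch of the box-as-tokens idea: normalized box coordinates are quantized into a small discrete vocabulary that the decoder generates token by token. The function names and `n_bins` value below are illustrative assumptions, not this repo's actual API.

```python
import torch

# Quantize a normalized [x, y, w, h] box (each value in [0, 1]) into
# discrete tokens; the decoder would generate these autoregressively.
# n_bins = 4000 is an assumed value; check the config for the real one.
def box_to_tokens(box_xywh: torch.Tensor, n_bins: int = 4000) -> torch.Tensor:
    return (box_xywh.clamp(0, 1) * (n_bins - 1)).round().long()

# Inverse mapping: discrete tokens back to continuous coordinates.
def tokens_to_box(tokens: torch.Tensor, n_bins: int = 4000) -> torch.Tensor:
    return tokens.float() / (n_bins - 1)

box = torch.tensor([0.25, 0.40, 0.30, 0.20])   # normalized x, y, w, h
tokens = box_to_tokens(box)                    # tensor([1000, 1600, 1200, 800])
print(tokens, tokens_to_box(tokens))
```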
Tracker | LaSOT (AUC) | GOT-10K (AO) | TrackingNet (AUC) |
---|---|---|---|
SeqTrack | 72.5 | 74.8 | 85.5 |
OSTrack | 71.1 | 73.7 | 83.9 |
SimTrack | 70.5 | 69.8 | 83.4 |
MixFormer | 70.1 | 70.7 | 83.9 |
Create and activate a conda environment, then install the dependencies:
conda create -n seqtrack python=3.8
conda activate seqtrack
bash install.sh
- Add the project path to the PYTHONPATH environment variable:
export PYTHONPATH=<absolute_path_of_SeqTrack>:$PYTHONPATH
Put the tracking datasets in ./data. The directory should look like this (a quick sanity-check sketch follows the tree):
${SeqTrack_ROOT}
-- data
-- lasot
|-- airplane
|-- basketball
|-- bear
...
-- got10k
|-- test
|-- train
|-- val
-- coco
|-- annotations
|-- images
-- trackingnet
|-- TRAIN_0
|-- TRAIN_1
...
|-- TRAIN_11
|-- TEST
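Before launching training, it can be handy to verify the layout above. A minimal sketch, assuming the tree shown (this helper is hypothetical and not part of the repo):

```python
import os

# Expected top-level dataset directories and a few key subdirectories,
# mirroring the tree above.
expected = {
    'lasot': [],
    'got10k': ['train', 'val', 'test'],
    'coco': ['annotations', 'images'],
    'trackingnet': ['TEST'],
}

for name, subdirs in expected.items():
    base = os.path.join('data', name)
    for p in [base] + [os.path.join(base, s) for s in subdirs]:
        if not os.path.isdir(p):
            print('missing:', p)
```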
Run the following command to set the paths for this project:
python tracking/create_default_local_file.py --workspace_dir . --data_dir ./data --save_dir .
After running this command, you can also modify the paths by editing these two files:
lib/train/admin/local.py # paths for training
lib/test/evaluation/local.py # paths for testing
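For orientation, here is an illustrative excerpt of what a generated lib/train/admin/local.py looks like in PyTracking-style codebases; the exact attribute names are assumptions, so check the generated file rather than copying this verbatim.

```python
# Illustrative sketch of a PyTracking-style environment settings file;
# attribute names may differ in the file this repo actually generates.
class EnvironmentSettings:
    def __init__(self):
        self.workspace_dir = '/abs/path/to/SeqTrack'   # checkpoints and logs
        self.lasot_dir = '/abs/path/to/SeqTrack/data/lasot'
        self.got10k_dir = '/abs/path/to/SeqTrack/data/got10k'
        self.coco_dir = '/abs/path/to/SeqTrack/data/coco'
        self.trackingnet_dir = '/abs/path/to/SeqTrack/data/trackingnet'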
Train SeqTrack with 8 GPUs:
python -m torch.distributed.launch --nproc_per_node 8 lib/train/run_training.py --script seqtrack --config seqtrack_b256 --save_dir .
(Optional) Debug training with a single GPU:
python tracking/train.py --script seqtrack --config seqtrack_b256 --save_dir . --mode single
- LaSOT
python tracking/test.py seqtrack seqtrack_b256 --dataset lasot --threads 2
python tracking/analysis_results.py # need to modify tracker configs and names (see the sketch after the evaluation commands)
- GOT10K-test
python tracking/test.py seqtrack seqtrack_b256_got --dataset got10k_test --threads 2
python lib/test/utils/transform_got10k.py --tracker_name seqtrack --cfg_name seqtrack_b256_got # convert results for the GOT-10k evaluation server
- TrackingNet
python tracking/test.py seqtrack seqtrack_b256 --dataset trackingnet --threads 2
python lib/test/utils/transform_trackingnet.py --tracker_name seqtrack --cfg_name seqtrack_b256 # convert results for the TrackingNet evaluation server
- TNL2K
python tracking/test.py seqtrack seqtrack_b256 --dataset tnl2k --threads 2
python tracking/analysis_results.py # need to modify tracker configs and names
- UAV123
python tracking/test.py seqtrack seqtrack_b256 --dataset uav --threads 2
python tracking/analysis_results.py # need to modify tracker configs and names
- NFS
python tracking/test.py seqtrack seqtrack_b256 --dataset nfs --threads 2
python tracking/analysis_results.py # need to modify tracker configs and names
- VOT2020
Before evaluating "SeqTrack+AR" on VOT2020, please install some extra packages following external/AR/README.md
cd external/vot20/<workspace_dir>
export PYTHONPATH=<path to the seqtrack project>:$PYTHONPATH
vot evaluate --workspace . seqtrack_b256_ar
vot analysis --nocache
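For the benchmarks above that are scored locally with tracking/analysis_results.py, the edit usually boils down to naming the tracker and config to analyze. A minimal sketch, assuming the PyTracking/STARK-style analysis API; the module paths and the trackerlist/print_results signatures are assumptions to verify against this repo:

```python
from lib.test.analysis.plot_results import print_results
from lib.test.evaluation import get_dataset, trackerlist

# Point the analysis at the tracker/config pair that was just evaluated.
trackers = trackerlist(name='seqtrack', parameter_name='seqtrack_b256',
                       dataset_name='lasot', run_ids=None,
                       display_name='SeqTrack-B256')
dataset = get_dataset('lasot')
print_results(trackers, dataset, 'LaSOT', merge_results=True,
              plot_types=('success', 'prec', 'norm_prec'))
```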
# Profiling the SeqTrack-B256 model
python tracking/profile_model.py --script seqtrack --config seqtrack_b256
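The script reports model statistics; as a rough standalone illustration of one such number, here is a generic parameter-count sketch (not the repo's profiling code, and the Linear layer is a stand-in for illustration only):

```python
import torch

# Count trainable parameters of any torch.nn.Module.
def count_parameters(model: torch.nn.Module) -> int:
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

model = torch.nn.Linear(256, 4000)  # stand-in module, not SeqTrack
print(f'{count_parameters(model) / 1e6:.3f} M parameters')
```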
The trained models and the raw tracking results are provided in the model zoo.
- This codebase is built upon the STARK and PyTracking libraries, draws on Stable-Pix2Seq, and borrows AlphaRefine for VOT evaluation. We would like to thank their authors for providing these great libraries.