A-Kerim/Using-synthetic-data-for-person-tracking-under-adverse-weather-conditions

Using Synthetic Data for Person Tracking under Adverse Weather Conditions

Highlights

• A novel real dataset for pedestrian tracking under adverse weather conditions

• Showed that state-of-the-art trackers perform poorly under adverse weather conditions

• Procedurally generated a synthetic dataset covering adverse weather conditions

• Our synthetic dataset boosts trackers' performance on adverse-weather videos

Authors

Abdulrahman Kerim, Lancaster University, a.kerim@lancaster.ac.uk

Ufuk Celikcan, Hacettepe University

Erkut Erdem, Hacettepe University

Aykut Erdem, Koç University

Abstract

Robust visual tracking plays a vital role in many areas such as autonomous cars, surveillance and robotics. Recent trackers have been shown to achieve adequate results under normal tracking scenarios with clear weather, standard camera setups and good lighting conditions. Yet, the performance of these trackers, whether correlation filter-based or learning-based, degrades under adverse weather conditions. The lack of videos with such weather conditions in the available visual object tracking datasets is the prime issue behind the low performance of learning-based tracking algorithms. In this work, we provide a new person tracking dataset of real-world sequences (PTAW172Real) captured under foggy, rainy and snowy weather conditions to assess the performance of current trackers. We also introduce a novel person tracking dataset of synthetic sequences (PTAW217Synth), procedurally generated by our NOVA framework and spanning the same weather conditions in varying severity, to mitigate the problem of data scarcity. Our experimental results demonstrate that the performance of state-of-the-art deep trackers under adverse weather conditions can be boosted when the available real training sequences are complemented with our synthetically generated dataset during training.
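The abstract's key idea — complementing real training sequences with synthetic ones — can be illustrated with a minimal sketch. The sequence counts (172 real, 217 synthetic) follow from the dataset names; the sequence identifiers, the `mix_training_sequences` helper and the 50% mixing ratio are illustrative assumptions, not the paper's actual training recipe:

```python
import random

def mix_training_sequences(real_seqs, synth_seqs, synth_ratio=0.5, seed=0):
    """Combine real and synthetic sequences into one training pool.

    synth_ratio sets how many synthetic sequences are added relative
    to the number of real ones (an illustrative choice; the paper's
    exact mixing strategy may differ).
    """
    rng = random.Random(seed)
    n_synth = int(len(real_seqs) * synth_ratio)
    picked = rng.sample(synth_seqs, min(n_synth, len(synth_seqs)))
    pool = list(real_seqs) + picked
    rng.shuffle(pool)  # interleave real and synthetic sequences
    return pool

# Hypothetical sequence IDs sized after PTAW172Real and PTAW217Synth.
real = [f"real_{i:03d}" for i in range(172)]
synth = [f"synth_{i:03d}" for i in range(217)]
pool = mix_training_sequences(real, synth, synth_ratio=0.5)
```

With a ratio of 0.5 the pool holds all 172 real sequences plus 86 synthetic ones; a tracker's data loader would then draw training samples from this mixed pool.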

Acknowledgements

This work was supported in part by TUBITAK-1001 Program (Grant No. 217E029), GEBIP 2018 fellowship of Turkish Academy of Sciences awarded to E. Erdem, and BAGEP 2021 Award of the Science Academy awarded to A. Erdem.

Citation

  • Paper published in Image and Vision Computing.
@article{KERIM2021104187,
title = {Using synthetic data for person tracking under adverse weather conditions},
journal = {Image and Vision Computing},
volume = {111},
pages = {104187},
year = {2021},
issn = {0262-8856},
doi = {10.1016/j.imavis.2021.104187},
url = {https://www.sciencedirect.com/science/article/pii/S0262885621000925},
author = {Abdulrahman Kerim and Ufuk Celikcan and Erkut Erdem and Aykut Erdem},
keywords = {Person tracking, Synthetic data, Rendering, Procedural generation},
}
