Publication:
Using synthetic data for person tracking under adverse weather conditions

dc.contributor.coauthor: Kerim, Abdulrahman
dc.contributor.coauthor: Çelikcan, Ufuk
dc.contributor.coauthor: Erdem, Erkut
dc.contributor.department: Department of Computer Engineering
dc.contributor.kuauthor: Erdem, Aykut
dc.contributor.kuprofile: Faculty Member
dc.contributor.other: Department of Computer Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.yokid: 20331
dc.date.accessioned: 2024-11-10T00:01:37Z
dc.date.issued: 2021
dc.description.abstract: Robust visual tracking plays a vital role in many areas such as autonomous cars, surveillance and robotics. Recent trackers were shown to achieve adequate results under normal tracking scenarios with clear weather conditions, standard camera setups and lighting conditions. Yet, the performance of these trackers, whether they are correlation filter-based or learning-based, degrades under adverse weather conditions. The lack of videos with such weather conditions, in the available visual object tracking datasets, is the prime issue behind the low performance of the learning-based tracking algorithms. In this work, we provide a new person tracking dataset of real-world sequences (PTAW172Real) captured under foggy, rainy and snowy weather conditions to assess the performance of the current trackers. We also introduce a novel person tracking dataset of synthetic sequences (PTAW217Synth) procedurally generated by our NOVA framework spanning the same weather conditions in varying severity to mitigate the problem of data scarcity. Our experimental results demonstrate that the performances of the state-of-the-art deep trackers under adverse weather conditions can be boosted when the available real training sequences are complemented with our synthetically generated dataset during training. (c) 2021 Elsevier B.V. All rights reserved.
dc.description.indexedby: WoS
dc.description.indexedby: Scopus
dc.description.openaccess: YES
dc.description.publisherscope: International
dc.description.sponsorship: TUBITAK-1001 Program [217E029]
dc.description.sponsorship: GEBIP 2018 fellowship of Turkish Academy of Sciences
dc.description.sponsorship: BAGEP 2021 Award of the Science Academy. This work was supported in part by TUBITAK-1001 Program (Grant No. 217E029), GEBIP 2018 fellowship of Turkish Academy of Sciences awarded to E. Erdem, and BAGEP 2021 Award of the Science Academy awarded to A. Erdem.
dc.description.volume: 111
dc.identifier.doi: 10.1016/j.imavis.2021.104187
dc.identifier.eissn: 1872-8138
dc.identifier.issn: 0262-8856
dc.identifier.quartile: Q1
dc.identifier.scopus: 2-s2.0-85105274958
dc.identifier.uri: http://dx.doi.org/10.1016/j.imavis.2021.104187
dc.identifier.uri: https://hdl.handle.net/20.500.14288/15997
dc.identifier.wos: 658385700012
dc.keywords: Person tracking
dc.keywords: Synthetic data
dc.keywords: Rendering
dc.keywords: Procedural generation
dc.language: English
dc.publisher: Elsevier
dc.source: Image and Vision Computing
dc.subject: Computer science
dc.subject: Artificial intelligence
dc.subject: Engineering
dc.subject: Software engineering
dc.subject: Theory methods
dc.subject: Electrical electronic engineering
dc.subject: Optics
dc.title: Using synthetic data for person tracking under adverse weather conditions
dc.type: Journal Article
dspace.entity.type: Publication
local.contributor.authorid: 0000-0002-6280-8422
local.contributor.kuauthor: Erdem, Aykut
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae