Ajou University repository

3D Vehicle Trajectory Extraction Using DCNN in an Overlapping Multi-Camera Crossroad Scene
Citations (SCOPUS): 2


DC Field / Value
dc.contributor.author: Heo, Jinyeong
dc.contributor.author: Kwon, Yongjin
dc.date.issued: 2021-12-01
dc.identifier.issn: 1424-8220
dc.identifier.uri: https://dspace.ajou.ac.kr/dev/handle/2018.oak/32393
dc.description.abstract: The 3D vehicle trajectory in complex traffic conditions such as crossroads and heavy traffic is practically very useful in autonomous driving. In order to accurately extract the 3D vehicle trajectory from a perspective camera at a crossroad, where a vehicle can take any heading within a 360-degree range, problems such as the narrow visual angle of a single-camera scene, vehicle occlusion under conditions of a low camera perspective, and the lack of vehicle physical information must be solved. In this paper, we propose a method for estimating the 3D bounding boxes of vehicles and extracting their trajectories using a deep convolutional neural network (DCNN) in an overlapping multi-camera crossroad scene. First, traffic data were collected using overlapping multi-cameras to obtain a wide range of trajectories around the crossroad. Then, the 3D bounding boxes of vehicles were estimated and tracked in each single-camera scene through DCNN models (YOLOv4, multi-branch CNN) combined with camera calibration. Using this information, the 3D vehicle trajectory could be extracted on the ground plane of the crossroad by combining the results from the overlapping multi-cameras with a homography matrix. Finally, in experiments, the errors of the extracted trajectories were corrected through simple linear interpolation and regression, and the accuracy of the proposed method was verified by calculating the difference from ground-truth data. Compared with other previously reported methods, our approach is shown to be more accurate and more practical.
dc.description.sponsorship: Funding: This research was supported by the Unmanned Vehicles Core Technology Research and Development Program through the National Research Foundation of Korea (NRF) and Unmanned Vehicle Advanced Research Center (UVARC) funded by the Ministry of Science and ICT, the Republic of Korea (Grant Number: 2020M3C1C1A01084900).
dc.language.iso: eng
dc.publisher: MDPI
dc.subject.mesh: 3-D trajectory
dc.subject.mesh: 3D bounding box estimation
dc.subject.mesh: 3D trajectory extraction
dc.subject.mesh: Bounding-box
dc.subject.mesh: Camera calibration
dc.subject.mesh: Multi-cameras
dc.subject.mesh: Multi-object tracking
dc.subject.mesh: Overlapping multi-camera crossroad scene
dc.subject.mesh: Trajectory extraction
dc.subject.mesh: Vehicle trajectories
dc.title: 3D Vehicle Trajectory Extraction Using DCNN in an Overlapping Multi-Camera Crossroad Scene
dc.type: Article
dc.citation.title: Sensors
dc.citation.volume: 21
dc.identifier.bibliographicCitation: Sensors, Vol.21
dc.identifier.doi: 10.3390/s21237879
dc.identifier.pmid: 34883887
dc.identifier.scopusid: 2-s2.0-85119718671
dc.identifier.url: https://www.mdpi.com/1424-8220/21/23/7879/pdf
dc.subject.keyword: 3D bounding box estimation
dc.subject.keyword: 3D trajectory extraction
dc.subject.keyword: Camera calibration
dc.subject.keyword: Multi-object tracking
dc.subject.keyword: Overlapping multi-camera crossroad scene
dc.description.isoa: true
dc.subject.subarea: Analytical Chemistry
dc.subject.subarea: Information Systems
dc.subject.subarea: Atomic and Molecular Physics, and Optics
dc.subject.subarea: Biochemistry
dc.subject.subarea: Instrumentation
dc.subject.subarea: Electrical and Electronic Engineering
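
As described in the abstract above, per-camera 3D bounding box detections are mapped onto the crossroad ground plane with a homography matrix, and short gaps in the resulting trajectories are corrected by simple linear interpolation. The sketch below illustrates only these two steps and is not the authors' implementation; the homography values, detection points, and function names are hypothetical placeholders.

```python
# Minimal sketch (not the authors' code): project per-camera detections onto a
# common ground plane with a homography matrix, then fill short tracking gaps
# by linear interpolation. All numeric values are hypothetical placeholders.
import numpy as np

def project_to_ground(points_px: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Map Nx2 image-plane points to ground-plane coordinates via a 3x3 homography H."""
    pts_h = np.hstack([points_px, np.ones((len(points_px), 1))])  # homogeneous coordinates
    mapped = (H @ pts_h.T).T
    return mapped[:, :2] / mapped[:, 2:3]  # divide by w to de-homogenize

def interpolate_gaps(track: np.ndarray) -> np.ndarray:
    """Fill NaN gaps in an Nx2 ground-plane track by per-axis linear interpolation."""
    t = np.arange(len(track))
    filled = track.copy()
    for axis in range(2):
        valid = ~np.isnan(track[:, axis])
        filled[:, axis] = np.interp(t, t[valid], track[valid, axis])
    return filled

if __name__ == "__main__":
    # Hypothetical homography from one camera's image plane to the crossroad ground plane.
    H = np.array([[0.02, 0.001, -5.0],
                  [0.0005, 0.03, -8.0],
                  [0.00001, 0.0002, 1.0]])
    # Bottom-center pixels of one tracked vehicle's box over five frames (one frame missed).
    detections_px = np.array([[640.0, 520.0], [655.0, 522.0], [np.nan, np.nan],
                              [688.0, 527.0], [704.0, 530.0]])
    ground = np.full_like(detections_px, np.nan)
    valid = ~np.isnan(detections_px[:, 0])
    ground[valid] = project_to_ground(detections_px[valid], H)
    print(interpolate_gaps(ground))  # ground-plane trajectory with the gap filled
```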

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Kwon, Yong Jin (권용진)
Department of Industrial Engineering

File Download

  • There are no files associated with this item.