Ajou University repository

OmniStitch: Depth-Aware Stitching Framework for Omnidirectional Vision with Multiple Cameras
Citations (SCOPUS): 0

DC Field | Value
dc.contributor.author | Kim, Sooho
dc.contributor.author | Hong, Soyeon
dc.contributor.author | Park, Kyungsoo
dc.contributor.author | Cho, Hyunsouk
dc.contributor.author | Sohn, Kyung Ah
dc.date.issued | 2024-10-28
dc.identifier.uri | https://aurora.ajou.ac.kr/handle/2018.oak/37154
dc.identifier.uri | https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85209788020&origin=inward
dc.description.abstract | Omnidirectional vision systems provide a 360-degree panoramic view, enabling full environmental awareness in various fields such as Advanced Driver Assistance Systems (ADAS) and Virtual Reality (VR). Existing omnidirectional stitching methods rely on a single specialized 360-degree camera. However, due to hardware limitations such as high mounting heights and blind spots, adapting these methods to vehicles of varying sizes and geometries is challenging. These challenges include limited generalizability due to the reliance on predefined stitching regions for fixed camera arrays, performance degradation from distance parallax leading to large depth differences, and the absence of suitable datasets with ground truth for multi-camera omnidirectional systems. To overcome these challenges, we propose a novel omnidirectional stitching framework and a publicly available dataset tailored for varying-distance scenarios with multiple cameras. The framework, referred to as OmniStitch, consists of a Stitching Region Maximization (SRM) module for automatic adaptation to different vehicles with multiple cameras and a Depth-Aware Stitching (DAS) module to handle depth differences caused by distance parallax between cameras. In addition, we create and release an omnidirectional stitching dataset, called GV360, which provides ground-truth images that maintain the perspective of the 360-degree FOV, designed explicitly for vehicle-agnostic systems. Extensive evaluations on this dataset demonstrate that our framework outperforms state-of-the-art stitching models, especially in handling varying distance parallax. The proposed dataset and code are publicly available at https://github.com/tngh5004/Omnistitch.
dc.language.iso | eng
dc.publisher | Association for Computing Machinery, Inc
dc.subject.mesh | Advanced driver assistances
dc.subject.mesh | Environmental awareness
dc.subject.mesh | Ground truth
dc.subject.mesh | Image stitching
dc.subject.mesh | Multiple cameras
dc.subject.mesh | Omni-directional view
dc.subject.mesh | Omni-directional vision
dc.subject.mesh | Omnidirectional view dataset
dc.subject.mesh | Omnidirectional vision system
dc.subject.mesh | Panoramic views
dc.title | OmniStitch: Depth-Aware Stitching Framework for Omnidirectional Vision with Multiple Cameras
dc.type | Conference
dc.citation.conferenceDate | 2024.10.28. ~ 2024.11.1.
dc.citation.conferenceName | 32nd ACM International Conference on Multimedia, MM 2024
dc.citation.edition | MM 2024 - Proceedings of the 32nd ACM International Conference on Multimedia
dc.citation.startPage | 10210
dc.citation.endPage | 10219
dc.citation.title | MM 2024 - Proceedings of the 32nd ACM International Conference on Multimedia
dc.identifier.bibliographicCitation | MM 2024 - Proceedings of the 32nd ACM International Conference on Multimedia, pp. 10210-10219
dc.identifier.doi | 10.1145/3664647.3681208
dc.identifier.scopusid | 2-s2.0-85209788020
dc.subject.keyword | image stitching
dc.subject.keyword | omnidirectional view dataset
dc.subject.keyword | omnidirectional vision
dc.type.other | Conference Paper
dc.description.isoa | false
dc.subject.subarea | Artificial Intelligence
dc.subject.subarea | Computer Graphics and Computer-Aided Design
dc.subject.subarea | Human-Computer Interaction
dc.subject.subarea | Software

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Cho, Hyunsouk (조현석)
Department of Software and Computer Engineering

File Download

  • There are no files associated with this item.