Detailed Information

Cited 0 times in Web of Science; cited 1 time in Scopus

Linear Four-Point LiDAR SLAM for Manhattan World Environments

Full metadata record
dc.contributor.author: Jeong, Eunju
dc.contributor.author: Lee, Jina
dc.contributor.author: Kang, Suyoung
dc.contributor.author: Kim, Pyojin
dc.date.accessioned: 2023-12-19T04:01:30Z
dc.date.available: 2023-12-19T04:01:30Z
dc.date.issued: 2023-11
dc.identifier.issn: 2377-3766
dc.identifier.uri: https://scholarworks.sookmyung.ac.kr/handle/2020.sw.sookmyung/159480
dc.description.abstract: We present a new SLAM algorithm that utilizes an inexpensive four-point LiDAR to compensate for the limited sensing range and viewing angle of RGB-D cameras. The four-point LiDAR detects distances up to 40 m but senses only four distance measurements per scan. In open spaces, RGB-D SLAM approaches, such as L-SLAM, fail to estimate robust 6-DoF camera poses due to the limitations of the RGB-D camera. We detect walls beyond the range of RGB-D cameras using the four-point LiDAR; subsequently, we build a reliable global Manhattan world (MW) map while simultaneously estimating 6-DoF camera poses. By leveraging the structural regularities of indoor MW environments, we overcome the challenge of SLAM with the sparse sensing of the four-point LiDAR. We expand the application range of L-SLAM while preserving its strong performance, even in low-textured environments, using the linear Kalman filter (KF) framework. Our experiments in various indoor MW spaces, including open spaces, demonstrate that the performance of the proposed method is comparable to that of other state-of-the-art SLAM methods.
dc.format.extent: 8
dc.language: English
dc.language.iso: ENG
dc.publisher: Institute of Electrical and Electronics Engineers Inc.
dc.title: Linear Four-Point LiDAR SLAM for Manhattan World Environments
dc.type: Article
dc.publisher.location: United States
dc.identifier.doi: 10.1109/LRA.2023.3315205
dc.identifier.scopusid: 2-s2.0-85171593120
dc.identifier.wosid: 001081553900005
dc.identifier.bibliographicCitation: IEEE Robotics and Automation Letters, v.8, no.11, pp. 7392-7399
dc.citation.title: IEEE Robotics and Automation Letters
dc.citation.volume: 8
dc.citation.number: 11
dc.citation.startPage: 7392
dc.citation.endPage: 7399
dc.type.docType: Article
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Robotics
dc.relation.journalWebOfScienceCategory: Robotics
dc.subject.keywordAuthor: 6-DOF
dc.subject.keywordAuthor: Cameras
dc.subject.keywordAuthor: Computer Vision for Transportation
dc.subject.keywordAuthor: Laser radar
dc.subject.keywordAuthor: Point cloud compression
dc.subject.keywordAuthor: RGB-D Perception
dc.subject.keywordAuthor: Sensor Fusion
dc.subject.keywordAuthor: Sensors
dc.subject.keywordAuthor: Simultaneous localization and mapping
dc.subject.keywordAuthor: Three-dimensional displays
dc.subject.keywordAuthor: Vision-Based Navigation
dc.identifier.url: https://ieeexplore.ieee.org/document/10250905
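
The abstract above describes estimating 6-DoF camera poses with a linear Kalman filter (KF) by exploiting Manhattan world structure: once the MW rotation is fixed, each measured distance to an axis-aligned wall is linear in the camera position, so no EKF-style linearization is needed. The following is a minimal illustrative sketch of such a linear KF update; the constant-position state model, noise values, and all names here are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

# Minimal linear-KF sketch for translation estimation in a Manhattan world.
# Assumes the MW rotation has already been estimated, so a distance to an
# axis-aligned wall is linear in the camera position:
#   z = wall_offset - x[axis] + noise   =>   H = -e_axis (fully linear).
# All parameters below are illustrative, not from the paper.

class LinearKF:
    def __init__(self, x0, P0, Q, R):
        self.x = x0  # state: 3-D camera position
        self.P = P0  # state covariance
        self.Q = Q   # process noise covariance
        self.R = R   # measurement noise covariance (1x1, one wall distance)

    def predict(self):
        # Constant-position motion model (F = I): only covariance grows.
        self.P = self.P + self.Q

    def update(self, z, axis, wall_offset):
        # One wall-distance measurement along a Manhattan axis.
        H = np.zeros((1, 3))
        H[0, axis] = -1.0
        y = z - (wall_offset + H @ self.x)    # innovation
        S = H @ self.P @ H.T + self.R         # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(3) - K @ H) @ self.P

kf = LinearKF(x0=np.zeros(3), P0=np.eye(3),
              Q=1e-3 * np.eye(3), R=np.array([[1e-2]]))
kf.predict()
# Wall at x = 15 m, measured 12.4 m away: pulls the estimate toward x ~ 2.6.
kf.update(z=12.4, axis=0, wall_offset=15.0)
print(kf.x)
```

Keeping the measurement model linear in the position (given the MW rotation) is what makes a plain KF sufficient here; a general wall orientation would require an EKF or iterative linearization.
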
Appears in Collections: College of Engineering > School of Mechanical Systems > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
