Linear Four-Point LiDAR SLAM for Manhattan World Environments
- Authors
- Jeong, Eunju; Lee, Jina; Kang, Suyoung; Kim, Pyojin
- Issue Date
- Nov-2023
- Publisher
- Institute of Electrical and Electronics Engineers Inc.
- Keywords
- 6-DOF; Cameras; Computer Vision for Transportation; Laser radar; Point cloud compression; RGB-D Perception; Sensor Fusion; Sensors; Simultaneous localization and mapping; Three-dimensional displays; Vision-Based Navigation
- Citation
- IEEE Robotics and Automation Letters, v.8, no.11, pp. 7392-7399
- Pages
- 8
- Journal Title
- IEEE Robotics and Automation Letters
- Volume
- 8
- Number
- 11
- Start Page
- 7392
- End Page
- 7399
- URI
- https://scholarworks.sookmyung.ac.kr/handle/2020.sw.sookmyung/159480
- DOI
- 10.1109/LRA.2023.3315205
- ISSN
- 2377-3766
- Abstract
- We present a new SLAM algorithm that utilizes an inexpensive four-point LiDAR to compensate for the short range and narrow viewing angle of RGB-D cameras. Herein, the four-point LiDAR can detect distances up to 40 m, but it senses only four distance measurements per scan. In open spaces, RGB-D SLAM approaches, such as L-SLAM, fail to estimate robust 6-DoF camera poses due to the limitations of the RGB-D camera. We detect walls beyond the range of RGB-D cameras using the four-point LiDAR; subsequently, we build a reliable global Manhattan world (MW) map while simultaneously estimating 6-DoF camera poses. By leveraging the structural regularities of indoor MW environments, we overcome the challenge of SLAM with the sparse sensing of the four-point LiDAR. We expand the application range of L-SLAM while preserving its strong performance, even in low-textured environments, using a linear Kalman filter (KF) framework. Our experiments in various indoor MW spaces, including open spaces, demonstrate that the performance of the proposed method is comparable to that of other state-of-the-art SLAM methods.
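A minimal sketch of why the MW assumption makes the filtering linear, as the abstract suggests: when walls are axis-aligned, a range reading to a wall is a linear function of the sensor position along that axis, so a plain linear KF update suffices. The function, values, and noise parameters below are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: in a Manhattan world, the distance to an
# axis-aligned wall is linear in the sensor position along that axis,
# so a scalar linear Kalman filter update can fuse each range reading.

def kf_update_1axis(x, P, z, wall, R):
    """Update one axis of the position estimate from a range reading.

    x, P : position estimate along the axis and its variance
    z    : measured distance to a wall on this axis
    wall : known wall coordinate on this axis (from the MW map)
    R    : measurement noise variance

    Measurement model: z = wall - x, which is linear with H = -1.
    """
    H = -1.0
    y = z - (wall - x)      # innovation (measured minus predicted range)
    S = H * P * H + R       # innovation variance
    K = P * H / S           # Kalman gain
    x = x + K * y           # corrected position
    P = (1.0 - K * H) * P   # corrected variance
    return x, P

# Illustrative numbers: weak prior at x = 0, wall at 40 m,
# range reading of 36.9 m implies the sensor sits near x = 3.1 m.
x_est, P = 0.0, 100.0
x_est, P = kf_update_1axis(x_est, P, 36.9, 40.0, 0.04)
```

With a weak prior, a single accurate reading pulls the estimate close to the implied position (about 3.1 m here) and shrinks the variance toward the measurement noise level; this per-axis decoupling is what the MW structure buys over general 6-DoF filtering.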
- Appears in
Collections - College of Engineering > School of Mechanical Systems > 1. Journal Articles
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.