Robust Visual Localization in Changing Lighting Conditions
- Authors
- Pyojin Kim; Brian Coltin; Oleg Alexandrov; H. Jin Kim
- Issue Date
- Jul-2017
- Publisher
- Institute of Electrical and Electronics Engineers Inc.
- Citation
- Proceedings - IEEE International Conference on Robotics and Automation, pp. 5447-5452
- Pages
- 6
- Journal Title
- Proceedings - IEEE International Conference on Robotics and Automation
- Start Page
- 5447
- End Page
- 5452
- URI
- https://scholarworks.sookmyung.ac.kr/handle/2020.sw.sookmyung/151353
- DOI
- 10.1109/ICRA.2017.7989640
- ISSN
- 1050-4729
- Abstract
- We present an illumination-robust visual localization algorithm for Astrobee, a free-flying robot designed to autonomously navigate on the International Space Station (ISS). Astrobee localizes with a monocular camera and a pre-built sparse map composed of natural visual features. Astrobee must perform tasks not only during the day, but also at night when the ISS lights are dimmed. However, the localization performance degrades when the observed lighting conditions differ from the conditions when the sparse map was built. We investigate and quantify the effect of lighting variations on visual feature-based localization systems, and discover that maps built in darker conditions can also be effective in bright conditions, but the reverse is not true. We extend Astrobee's localization algorithm to make it more robust to changing-light environments on the ISS by automatically recognizing the current illumination level, and selecting an appropriate map and camera exposure time. We extensively evaluate the proposed algorithm through experiments on Astrobee.
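The abstract's core idea — classify the current illumination level, then pick a sparse map and camera exposure accordingly — can be illustrated with a minimal sketch. All names, thresholds, and exposure values below are illustrative assumptions, not the paper's actual parameters; the only detail taken from the abstract is that a map built in darker conditions also works in brighter conditions, so ambiguous cases favor the darker-built map.

```python
# Hypothetical sketch of illumination-aware map/exposure selection.
# Thresholds, map names, and exposure times are assumptions for illustration.

def mean_brightness(gray_image):
    """Average pixel intensity (0-255) of a grayscale image given as rows of ints."""
    pixels = [p for row in gray_image for p in row]
    return sum(pixels) / len(pixels)

def select_map_and_exposure(gray_image, dark_threshold=60.0, bright_threshold=150.0):
    """Classify the current illumination and pick a (map, exposure_ms) pair.

    Per the paper's finding, a map built in darker conditions generalizes to
    brighter ones (but not vice versa), so mid-range lighting keeps the
    darker-built map and only adjusts exposure.
    """
    b = mean_brightness(gray_image)
    if b < dark_threshold:
        return ("night_map", 33.0)   # long exposure for dimmed ISS lighting
    elif b < bright_threshold:
        return ("night_map", 16.0)   # darker-built map still effective when brighter
    else:
        return ("day_map", 8.0)      # short exposure to avoid saturation

# Example: a uniformly dim frame selects the night map with long exposure.
dim_frame = [[30] * 4 for _ in range(4)]
print(select_map_and_exposure(dim_frame))  # -> ('night_map', 33.0)
```

In practice the robot's classifier would operate on real camera frames (and the paper evaluates on Astrobee hardware); this sketch only shows the selection logic's shape.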
- Appears in
Collections - College of Engineering > Division of Mechanical Systems Engineering > 1. Journal Articles