Background subtraction with neighbor-based intensity correction algorithm
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Huynh-The T. | - |
dc.contributor.author | Banos O. | - |
dc.contributor.author | Le B.-V. | - |
dc.contributor.author | Bui D.-M. | - |
dc.contributor.author | Lee S. | - |
dc.contributor.author | Yoon Y. | - |
dc.contributor.author | Le-Tien T. | - |
dc.date.available | 2021-02-22T11:30:20Z | - |
dc.date.issued | 2015-10 | - |
dc.identifier.issn | 2162-1020 | - |
dc.identifier.uri | https://scholarworks.sookmyung.ac.kr/handle/2020.sw.sookmyung/9984 | - |
dc.description.abstract | An efficient foreground detection algorithm is presented in this work that is robust against consecutive illuminance changes and noise, and adaptive to dynamic motion speeds in the background. The scene background is first modeled by a novel algorithm, named Neighbor-based Intensity Correction, which identifies and modifies motion pixels extracted from the difference between the background and the current frame. Concretely, the first frame is taken as the initial background and updated at each newly arriving frame based on a standard-deviation comparison mechanism. Two pixel windows used for the standard deviation calculation are generated surrounding the corresponding motion pixel, one from the background and one from the current frame. The steadiness of the current background at the pixel level is measured by a constantly updating factor that decides whether the algorithm is applied. In the next stage, the foreground of the current frame is detected by a background subtraction scheme with an optimal Otsu threshold. This method is evaluated on various well-known datasets in the object detection and tracking area and compared with recent approaches via common quantitative measurements. Experimental results show that the proposed method achieves better results (an improvement of approximately 5-20%) in terms of foreground detection accuracy. © 2015 IEEE. | - |
dc.format.extent | 6 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | IEEE Computer Society | - |
dc.title | Background subtraction with neighbor-based intensity correction algorithm | - |
dc.type | Article | - |
dc.publisher.location | United States | - |
dc.identifier.doi | 10.1109/ATC.2015.7388331 | - |
dc.identifier.scopusid | 2-s2.0-84961772432 | - |
dc.identifier.bibliographicCitation | 2015 International Conference on Advanced Technologies for Communications (ATC), v.2016-January, pp 26 - 31 | - |
dc.citation.title | 2015 International Conference on Advanced Technologies for Communications (ATC) | - |
dc.citation.volume | 2016-January | - |
dc.citation.startPage | 26 | - |
dc.citation.endPage | 31 | - |
dc.type.docType | Conference Paper | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scopus | - |
dc.subject.keywordPlus | Electronic mail | - |
dc.subject.keywordPlus | Estimation | - |
dc.subject.keywordPlus | Object detection | - |
dc.subject.keywordPlus | Robustness (control systems) | - |
dc.subject.keywordPlus | Silicon | - |
dc.subject.keywordPlus | Standards | - |
dc.subject.keywordPlus | Statistics | - |
dc.subject.keywordPlus | Adaptation models | - |
dc.subject.keywordPlus | Background subtraction | - |
dc.subject.keywordPlus | Correction algorithms | - |
dc.subject.keywordPlus | Foreground detection | - |
dc.subject.keywordPlus | Object detection and tracking | - |
dc.subject.keywordPlus | Quantitative measurement | - |
dc.subject.keywordPlus | Standard deviation | - |
dc.subject.keywordPlus | Subtraction techniques | - |
dc.subject.keywordPlus | Pixels | - |
dc.subject.keywordAuthor | Adaptation models | - |
dc.subject.keywordAuthor | Electronic mail | - |
dc.subject.keywordAuthor | Estimation | - |
dc.subject.keywordAuthor | Robustness | - |
dc.subject.keywordAuthor | Silicon | - |
dc.subject.keywordAuthor | Standards | - |
dc.subject.keywordAuthor | Subtraction techniques | - |
dc.identifier.url | https://ieeexplore.ieee.org/document/7388331 | - |
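The pipeline described in the abstract can be sketched as follows. This is a minimal illustration of the idea, not the authors' implementation: the window size, the motion threshold, and the exact update rule are assumptions, and the steadiness factor mentioned in the abstract is omitted for brevity.

```python
import numpy as np

def otsu_threshold(diff):
    """Exhaustive Otsu threshold over an 8-bit difference image:
    pick the split that maximizes between-class variance."""
    hist = np.bincount(diff.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

def nic_update(background, frame, win=1, motion_thresh=15):
    """Neighbor-based intensity correction (sketch): for each motion
    pixel, compare the standard deviation of its surrounding window in
    the background and in the current frame; if the frame window is at
    least as steady, absorb the pixel into the background model.
    `win` and `motion_thresh` are assumed parameters."""
    bg = background.astype(float).copy()
    f = frame.astype(float)
    h, w = f.shape
    motion = np.abs(f - bg) > motion_thresh
    for y, x in zip(*np.nonzero(motion)):
        y0, y1 = max(0, y - win), min(h, y + win + 1)
        x0, x1 = max(0, x - win), min(w, x + win + 1)
        if f[y0:y1, x0:x1].std() <= bg[y0:y1, x0:x1].std():
            bg[y, x] = f[y, x]  # locally steady: update the model
    return bg.astype(np.uint8)

def detect_foreground(background, frame):
    """Background subtraction with an Otsu-thresholded difference."""
    diff = np.abs(frame.astype(int) - background.astype(int)).astype(np.uint8)
    return diff > otsu_threshold(diff)
```

A global illuminance shift is absorbed into the background by `nic_update` (both windows are equally steady), while a compact bright object survives as foreground in `detect_foreground`, which matches the robustness claims in the abstract.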